---
base_model:
  - appvoid/palmer-002-32k
  - raidhon/coven_tiny_1.1b_32k_orpo_alpha
  - appvoid/palmer-003
library_name: transformers
tags:
  - mergekit
  - merge
license: apache-2.0
---

# palmer-004

## Introducing palmer-004

We are thrilled to unveil palmer-004, the fourth iteration in the palmer series, designed to push the boundaries of performance and context window capacity. Merging the best of palmer-003 with the expanded context of palmer-002-32k, palmer-004 demonstrates the power of merging similarly trained models: think of it as palmer-003 with a 32k context window and minimal performance degradation.
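As a quick start, here is a minimal usage sketch with 🤗 Transformers. It assumes the model is published as `appvoid/palmer-004` (adjust the repo id if needed); the prompt and generation settings are purely illustrative.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed repo id; adjust if the model lives under a different name.
model_id = "appvoid/palmer-004"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# palmer models are base (non-chat) models, so plain text completion works best.
prompt = "The palmer series of language models is"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```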

| Model        | MMLU   | ARC-C  | HellaSwag | PIQA   | Winogrande |
|--------------|--------|--------|-----------|--------|------------|
| tinyllama-3t | 0.2577 | 0.3029 | 0.5935    | 0.7329 | 0.5959     |
| palmer-003   | 0.2523 | 0.3439 | 0.6208    | 0.7524 | 0.6590     |
| palmer-004   | 0.2601 | 0.3456 | 0.6138    | 0.7443 | 0.6511     |

Crafted using the TIES merge method, palmer-004 extends its predecessor to a 32k context window without training on any additional data, and it outperforms the current state-of-the-art coven_tiny_1.1b_32k_orpo_alpha on several benchmarks. As of June 1, 2024, palmer-004 stands as the second-best 1B model overall and the best 32k 1B model to fine-tune from on Hugging Face, setting a new standard for excellence in the open-source AI community.
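For reference, a TIES merge like this is typically expressed as a mergekit YAML config. The sketch below is hypothetical: the exact densities, weights, and choice of base model for the palmer-004 recipe are not published here, so treat every parameter as a placeholder.

```yaml
# Hypothetical TIES config in mergekit's format; parameter values are
# placeholders, not the published palmer-004 recipe.
merge_method: ties
base_model: appvoid/palmer-002-32k   # assumed donor of the 32k context
models:
  - model: appvoid/palmer-003
    parameters:
      density: 0.5   # fraction of each task vector's parameters to keep
      weight: 0.5    # contribution of this model to the merge
  - model: raidhon/coven_tiny_1.1b_32k_orpo_alpha
    parameters:
      density: 0.5
      weight: 0.5
parameters:
  normalize: true
dtype: float16
```

With a config like this, `mergekit-yaml config.yml ./output-dir` would produce the merged checkpoint.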