appvoid committed on
Commit
a630151
1 Parent(s): 276ffe5

Update README.md

Files changed (1)
  1. README.md +5 -5
README.md CHANGED
@@ -11,10 +11,10 @@ tags:
  ---
  # palmer

- palmer-004 is a merge of models aiming to bring the performance of palmer-003 all the way to a 32k context window. It was merged using the [TIES](https://arxiv.org/abs/2306.01708) merge method, with [palmer-002-32k](https://huggingface.co/appvoid/palmer-002-32k) as the base.

- palmer-004 performs better than coven_tiny_1.1b_32k_orpo_alpha, which is the current SOTA on the open-llm-leaderboard, making it the best overall 1b model on Hugging Face as of 06/01/2024.

- The following models were included in the merge:
- * [coven_tiny_1.1b_32k_orpo_alpha](https://huggingface.co/raidhon/coven_tiny_1.1b_32k_orpo_alpha)
- * [palmer-003](https://huggingface.co/appvoid/palmer-003)
 
  ---
  # palmer

+ Introducing palmer-004: A Symphony of Innovation in Open-Source AI

+ We are thrilled to unveil palmer-004, the fourth iteration in our esteemed Palmer series, designed to push the boundaries of performance and context window capacity. Merging the best of palmer-003 with the expansive capabilities of palmer-002-32k, palmer-004 is a testament to the power of innovation and meticulous craftsmanship.

+ Crafted using the TIES merge method, palmer-004 elevates its predecessor's prowess to a remarkable 32k context window. This model doesn't just aim to match; it surpasses, outperforming the current state-of-the-art coven_tiny_1.1b_32k_orpo_alpha. As of June 1, 2024, palmer-004 stands as the premier 1B model on Hugging Face, setting a new standard for excellence in the open-source AI community.
+
+ Experience the future of language modeling with palmer-004 – a fusion of unparalleled performance and cutting-edge technology, offered to you for free. Join us in this exciting journey of innovation and discovery.
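
For readers who want to reproduce a merge like the one described above (TIES, with palmer-002-32k as the base and coven_tiny_1.1b_32k_orpo_alpha plus palmer-003 folded in), here is a minimal sketch of how such a recipe is commonly expressed for mergekit. The commit does not include the actual configuration, so the tool choice, the density/weight/normalize/dtype values, and the file names below are illustrative assumptions rather than the authors' settings; only the model names come from the README.

```python
# Minimal sketch: write a mergekit-style TIES config matching the merge described in the README.
# All parameter values are assumptions; only the model repo ids come from the README text.
from pathlib import Path

MERGE_CONFIG = """\
merge_method: ties
base_model: appvoid/palmer-002-32k
models:
  - model: raidhon/coven_tiny_1.1b_32k_orpo_alpha
    parameters:
      density: 0.5   # assumed fraction of each task vector kept after trimming
      weight: 0.5    # assumed relative contribution to the merged weights
  - model: appvoid/palmer-003
    parameters:
      density: 0.5
      weight: 0.5
parameters:
  normalize: true    # assumed: rescale the summed task vectors
dtype: float16
"""

Path("ties-merge.yaml").write_text(MERGE_CONFIG)
print("Wrote ties-merge.yaml")
# The merged checkpoint would then typically be produced with mergekit's CLI, e.g.:
#   mergekit-yaml ties-merge.yaml ./merged-model
```

As background on the design, TIES trims each fine-tuned model's task vector, elects a sign per parameter, and merges only the agreeing deltas onto the base model, which is why each contributing model carries its own density and weight in a config like this.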