Update README.md
README.md CHANGED
@@ -9,7 +9,7 @@ tags:
 ---
 ![palmer-003 logo](https://huggingface.co/appvoid/palmer-002.5/resolve/main/003.png)
 
-Creative writing has never been so accessible; palmer goes beyond what was thought possible for small language models. This model is a "MErging of Experts" (MEoE) that uses the internal model `palmer-003` as its base, biased as an assistant and trained with the DPO technique without any prompts. As a result of these efforts, palmer is better than most 1b language models on most benchmarks, despite sometimes being 40% smaller than its counterparts.
+Creative writing has never been so accessible; palmer goes beyond what was thought possible for small language models. This model is a "MErging of Experts" (MEoE) that uses the internal model `palmer-003-2401` as its base, biased as an assistant and trained with the DPO technique without any prompts. As a result of these efforts, palmer is better than most 1b language models on most benchmarks, despite sometimes being 40% smaller than its counterparts.
 
 ```
 Model MMLU ARC-C OBQA HellaSwag PIQA Winogrande Average Params