Update README.md
README.md CHANGED
@@ -18,12 +18,17 @@ ramonda-7b-dpo-ties is a merge of the following models using [LazyMergekit](http
 * [bardsai/jaskier-7b-dpo-v4.3](https://huggingface.co/bardsai/jaskier-7b-dpo-v4.3)
 
 ## Benchmark
-
-
+[Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)
 | Model | Average | ARC | HellaSwag | MMLU | TruthfulQA | Winogrande | GSM8K |
 |------------------------|--------:|-----:|----------:|-----:|-----------:|-----------:|------:|
 | mayacinka/ramonda-7b-dpo-ties | 76.19 | 72.7 | 89.69 | 64.5 | 77.17 | 84.77 | 68.92 |
 
+[LLM AutoEval](https://gist.github.com/majacinka/370282a808a21b28bacd2c76a998da8f)
+| Model | AGIEval | GPT4All | TruthfulQA | Bigbench | Average |
+|----------------------|---------|---------|------------|----------|---------|
+| ramonda-7b-dpo-ties | 44.67 | 77.16 | 77.6 | 49.06 | 62.12 |
+
+
 
 ## 🧩 Configuration
 
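The Average columns in the two benchmark tables appear to be plain arithmetic means of the per-task scores. Below is a minimal sanity-check sketch, with the table values hard-coded for illustration; the assumption that Average is an unweighted mean is mine, and the Open LLM Leaderboard figure can differ by a few hundredths because the displayed per-task scores are rounded.

```python
# Recompute the Average columns from the per-task scores listed in the tables above.
# This is only an illustrative sanity check, not part of the evaluation pipeline.

def mean(scores: list[float]) -> float:
    return round(sum(scores) / len(scores), 2)

# Open LLM Leaderboard tasks: ARC, HellaSwag, MMLU, TruthfulQA, Winogrande, GSM8K
open_llm = [72.7, 89.69, 64.5, 77.17, 84.77, 68.92]
print(mean(open_llm))   # ~76.29; the table reports 76.19 (per-task scores are rounded)

# LLM AutoEval (Nous-style) tasks: AGIEval, GPT4All, TruthfulQA, Bigbench
autoeval = [44.67, 77.16, 77.6, 49.06]
print(mean(autoeval))   # ~62.12, consistent with the Average column
```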