Update README.md
README.md CHANGED
```diff
@@ -22,9 +22,9 @@ ExpertRamonda-7Bx2_MoE is a Mixure of Experts (MoE) made with the following mode
 # 🏆 Benchmarks
 
 ### Open LLM Leaderboard
-| Model | Average | ARC_easy | HellaSwag | MMLU |
+| Model | Average | ARC_easy | HellaSwag | MMLU | TruthfulQA_mc2 | Winogrande | GSM8K |
 |------------------------|--------:|-----:|----------:|-----:|-----------:|-----------:|------:|
-| mayacinka/ExpertRamonda-7Bx2_MoE |
+| mayacinka/ExpertRamonda-7Bx2_MoE | 78.10 | 86.87 | 87.51 | 61.63 | 78.02 | 81.85 | 72.71 |
 
 
 ### MMLU
```
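For reference, the new Average column appears to be the unweighted mean of the six per-task scores in the added row. A minimal sketch of that check (equal weighting across tasks is an assumption; the variable names are illustrative):

```python
# Check that the Average column matches the unweighted mean of the six
# task scores reported for mayacinka/ExpertRamonda-7Bx2_MoE.
scores = {
    "ARC_easy": 86.87,
    "HellaSwag": 87.51,
    "MMLU": 61.63,
    "TruthfulQA_mc2": 78.02,
    "Winogrande": 81.85,
    "GSM8K": 72.71,
}

average = sum(scores.values()) / len(scores)
print(f"Average: {average:.2f}")  # -> 78.10, matching the table
```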