| Column | dtype | Values (as shown in the preview) |
| --- | --- | --- |
| eval_name | string | lengths 12 to 111 |
| Precision | string | 3 values |
| Type | string | 6 values |
| T | string | 6 values |
| Weight type | string | 2 values |
| Architecture | string | 47 values |
| Model | string | lengths 355 to 650 |
| fullname | string | lengths 4 to 102 |
| Model sha | string | lengths 0 to 40 |
| Average ⬆️ | float64 | 1.41 to 51.2 |
| Hub License | string | 24 values |
| Hub ❤️ | int64 | 0 to 5.82k |
| #Params (B) | int64 | -1 to 140 |
| Available on the hub | bool | 2 classes |
| Not_Merged | bool | 2 classes |
| MoE | bool | 2 classes |
| Flagged | bool | 1 class |
| Chat Template | bool | 2 classes |
| CO₂ Emissions for Evaluation (kg) | float64 | 0.04 to 107 |
| IFEval Raw | float64 | 0 to 0.87 |
| IFEval | float64 | 0 to 86.7 |
| BBH Raw | float64 | 0.28 to 0.75 |
| BBH | float64 | 0.81 to 63.5 |
| MATH Lvl 5 Raw | float64 | 0 to 0.51 |
| MATH Lvl 5 | float64 | 0 to 50.7 |
| GPQA Raw | float64 | 0.22 to 0.41 |
| GPQA | float64 | 0 to 21.6 |
| MUSR Raw | float64 | 0.29 to 0.59 |
| MUSR | float64 | 0 to 36.4 |
| MMLU-PRO Raw | float64 | 0.1 to 0.7 |
| MMLU-PRO | float64 | 0 to 66.8 |
| Maintainer's Highlight | bool | 2 classes |
| Upload To Hub Date | string | lengths 0 to 10 |
| Submission Date | string | 142 values |
| Generation | int64 | 0 to 6 |
| Base Model | string | lengths 4 to 102 |
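Each record below lists its field values in the column order given above; fields that are empty for a particular record (for example a missing Hub License) appear to be omitted rather than rendered as blanks. As a minimal sketch of how a table like this can be queried programmatically, the snippet below uses the Hugging Face `datasets` library. The repository id `open-llm-leaderboard/contents` is an assumption (only the per-model `*-details` datasets are linked in the rows), so substitute the actual source of this preview.

```python
# Minimal sketch, assuming the rows below come from a Hugging Face dataset.
# The repo id "open-llm-leaderboard/contents" is an assumption, not taken from this preview.
from datasets import load_dataset

ds = load_dataset("open-llm-leaderboard/contents", split="train")

# Column names follow the schema listed above (eval_name, Precision, ..., Base Model).
print(ds.column_names)

# Example: the five entries with the highest normalized average score.
top = sorted(ds, key=lambda row: row["Average ⬆️"], reverse=True)[:5]
for row in top:
    print(f'{row["fullname"]}: {row["Average ⬆️"]:.2f}')
```

For ad-hoc analysis, `ds.to_pandas()` gives the same table as a DataFrame.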
CohereForAI_aya-expanse-8b_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
CohereForCausalLM
https://huggingface.co/CohereForAI/aya-expanse-8b 📑 https://huggingface.co/datasets/open-llm-leaderboard/CohereForAI__aya-expanse-8b-details
CohereForAI/aya-expanse-8b
b9848575c8731981dfcf2e1f3bfbcb917a2e585d
22.142223
cc-by-nc-4.0
269
8
true
true
true
false
true
1.169689
0.635852
63.585176
0.49772
28.523483
0.070242
7.024169
0.302852
7.04698
0.372885
4.410677
0.300366
22.262855
true
2024-10-23
2024-10-24
0
CohereForAI/aya-expanse-8b
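The paired Raw / non-Raw score columns in each record appear to be related by the leaderboard's usual normalization, rescaling a raw score between a per-task random-guess baseline and 1.0. This relation is inferred from the numbers themselves rather than stated in the preview; the sketch below reproduces, up to rounding, the IFEval, MATH Lvl 5, GPQA, and MMLU-PRO values of the CohereForAI/aya-expanse-8b record above from its raw scores. The baselines used (0 for IFEval and MATH Lvl 5, 0.25 for GPQA, 0.1 for MMLU-PRO) are assumptions inferred from the data; BBH and MUSR are left out because their normalization appears to be applied per subtask.

```python
# Sketch of the inferred raw -> normalized relation: 100 * (raw - baseline) / (1 - baseline).
# Baselines are assumptions inferred from the preview, not taken from documentation.
def normalize(raw: float, baseline: float) -> float:
    return 100.0 * (raw - baseline) / (1.0 - baseline)

# (raw score, assumed baseline, normalized value shown in the record above)
checks = {
    "IFEval":     (0.635852, 0.00, 63.585176),
    "MATH Lvl 5": (0.070242, 0.00, 7.024169),
    "GPQA":       (0.302852, 0.25, 7.04698),
    "MMLU-PRO":   (0.300366, 0.10, 22.262855),
}
for task, (raw, baseline, shown) in checks.items():
    print(f"{task}: {normalize(raw, baseline):.4f} (shown: {shown})")
```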
CohereForAI_c4ai-command-r-plus_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
CohereForCausalLM
https://huggingface.co/CohereForAI/c4ai-command-r-plus 📑 https://huggingface.co/datasets/open-llm-leaderboard/CohereForAI__c4ai-command-r-plus-details
CohereForAI/c4ai-command-r-plus
fa1bd7fb1572ceb861bbbbecfa8af83b29fa8cca
30.961247
cc-by-nc-4.0
1,681
103
true
true
true
false
true
28.631532
0.766419
76.641866
0.581542
39.919954
0.081571
8.1571
0.305369
7.38255
0.480719
20.423177
0.399186
33.242834
true
2024-04-03
2024-06-13
0
CohereForAI/c4ai-command-r-plus
CohereForAI_c4ai-command-r-plus-08-2024_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
CohereForCausalLM
https://huggingface.co/CohereForAI/c4ai-command-r-plus-08-2024 📑 https://huggingface.co/datasets/open-llm-leaderboard/CohereForAI__c4ai-command-r-plus-08-2024-details
CohereForAI/c4ai-command-r-plus-08-2024
2d8cf3ab0af78b9e43546486b096f86adf3ba4d0
33.584534
cc-by-nc-4.0
177
103
true
true
true
false
true
22.318877
0.753954
75.395395
0.5996
42.836865
0.120091
12.009063
0.350671
13.422819
0.482948
19.835156
0.442071
38.007905
true
2024-08-21
2024-09-19
0
CohereForAI/c4ai-command-r-plus-08-2024
CohereForAI_c4ai-command-r-v01_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
CohereForCausalLM
https://huggingface.co/CohereForAI/c4ai-command-r-v01 📑 https://huggingface.co/datasets/open-llm-leaderboard/CohereForAI__c4ai-command-r-v01-details
CohereForAI/c4ai-command-r-v01
16881ccde1c68bbc7041280e6a66637bc46bfe88
25.349978
cc-by-nc-4.0
1,066
34
true
true
true
false
true
13.395437
0.674819
67.481948
0.540642
34.556659
0
0
0.307047
7.606264
0.451698
16.128906
0.336935
26.326093
true
2024-03-11
2024-06-13
0
CohereForAI/c4ai-command-r-v01
Columbia-NLP_LION-Gemma-2b-dpo-v1.0_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
GemmaForCausalLM
https://huggingface.co/Columbia-NLP/LION-Gemma-2b-dpo-v1.0 📑 https://huggingface.co/datasets/open-llm-leaderboard/Columbia-NLP__LION-Gemma-2b-dpo-v1.0-details
Columbia-NLP/LION-Gemma-2b-dpo-v1.0
a5f780075831374f8850324448acf94976dea504
11.483995
0
2
false
true
true
false
true
0.979648
0.327831
32.783127
0.391996
14.585976
0.043051
4.305136
0.249161
0
0.41201
9.834635
0.166556
7.395095
false
2024-06-28
2024-07-04
0
Columbia-NLP/LION-Gemma-2b-dpo-v1.0
Columbia-NLP_LION-Gemma-2b-dpo-v1.0_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
GemmaForCausalLM
https://huggingface.co/Columbia-NLP/LION-Gemma-2b-dpo-v1.0 📑 https://huggingface.co/datasets/open-llm-leaderboard/Columbia-NLP__LION-Gemma-2b-dpo-v1.0-details
Columbia-NLP/LION-Gemma-2b-dpo-v1.0
a5f780075831374f8850324448acf94976dea504
11.1488
0
2
false
true
true
false
true
0.994569
0.310246
31.02457
0.388103
14.243046
0.046828
4.682779
0.253356
0.447427
0.408073
9.109115
0.166473
7.38586
false
2024-06-28
2024-07-04
0
Columbia-NLP/LION-Gemma-2b-dpo-v1.0
Columbia-NLP_LION-Gemma-2b-odpo-v1.0_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
GemmaForCausalLM
https://huggingface.co/Columbia-NLP/LION-Gemma-2b-odpo-v1.0 📑 https://huggingface.co/datasets/open-llm-leaderboard/Columbia-NLP__LION-Gemma-2b-odpo-v1.0-details
Columbia-NLP/LION-Gemma-2b-odpo-v1.0
090d9f59c3b47ab8dd099ddd278c058aa6d2d529
11.456795
4
2
false
true
true
false
true
0.962068
0.306649
30.664858
0.389584
14.023922
0.043051
4.305136
0.24245
0
0.427917
12.05625
0.169215
7.690603
false
2024-06-28
2024-07-13
0
Columbia-NLP/LION-Gemma-2b-odpo-v1.0
Columbia-NLP_LION-Gemma-2b-sft-v1.0_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
GemmaForCausalLM
https://huggingface.co/Columbia-NLP/LION-Gemma-2b-sft-v1.0 📑 https://huggingface.co/datasets/open-llm-leaderboard/Columbia-NLP__LION-Gemma-2b-sft-v1.0-details
Columbia-NLP/LION-Gemma-2b-sft-v1.0
44d6f26fa7e3b0d238064d844569bf8a07b7515e
12.489957
0
2
false
true
true
false
true
0.960809
0.369247
36.924693
0.387878
14.117171
0.061178
6.117825
0.255872
0.782998
0.40274
8.309115
0.178191
8.687943
false
2024-07-02
2024-07-04
0
Columbia-NLP/LION-Gemma-2b-sft-v1.0
Columbia-NLP_LION-LLaMA-3-8b-dpo-v1.0_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
https://huggingface.co/Columbia-NLP/LION-LLaMA-3-8b-dpo-v1.0 📑 https://huggingface.co/datasets/open-llm-leaderboard/Columbia-NLP__LION-LLaMA-3-8b-dpo-v1.0-details
Columbia-NLP/LION-LLaMA-3-8b-dpo-v1.0
3cddd4a6f5939a0a4db1092a0275342b7b9912f3
21.470701
2
8
false
true
true
false
true
0.696849
0.495742
49.574241
0.502848
30.356399
0.098187
9.818731
0.28104
4.138702
0.409719
10.28151
0.321892
24.654625
false
2024-06-28
2024-07-04
0
Columbia-NLP/LION-LLaMA-3-8b-dpo-v1.0
Columbia-NLP_LION-LLaMA-3-8b-odpo-v1.0_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
https://huggingface.co/Columbia-NLP/LION-LLaMA-3-8b-odpo-v1.0 📑 https://huggingface.co/datasets/open-llm-leaderboard/Columbia-NLP__LION-LLaMA-3-8b-odpo-v1.0-details
Columbia-NLP/LION-LLaMA-3-8b-odpo-v1.0
e2cec0d68a67092951e9205dfe634a59f2f4a2dd
19.462976
2
8
false
true
true
false
true
0.718697
0.396799
39.679938
0.502393
30.457173
0.083082
8.308157
0.285235
4.697987
0.40575
9.71875
0.315243
23.915854
false
2024-06-28
2024-07-04
0
Columbia-NLP/LION-LLaMA-3-8b-odpo-v1.0
Columbia-NLP_LION-LLaMA-3-8b-sft-v1.0_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
https://huggingface.co/Columbia-NLP/LION-LLaMA-3-8b-sft-v1.0 📑 https://huggingface.co/datasets/open-llm-leaderboard/Columbia-NLP__LION-LLaMA-3-8b-sft-v1.0-details
Columbia-NLP/LION-LLaMA-3-8b-sft-v1.0
822eddb2fd127178d9fb7bb9f4fca0e93ada2836
20.459336
0
8
false
true
true
false
true
0.753613
0.381712
38.171164
0.508777
30.88426
0.096677
9.667674
0.277685
3.691275
0.450271
15.483854
0.32372
24.857787
false
2024-07-02
2024-07-04
0
Columbia-NLP/LION-LLaMA-3-8b-sft-v1.0
CombinHorizon_Rombos-Qwen2.5-7B-Inst-BaseMerge-TIES_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
https://huggingface.co/CombinHorizon/Rombos-Qwen2.5-7B-Inst-BaseMerge-TIES 📑 https://huggingface.co/datasets/open-llm-leaderboard/CombinHorizon__Rombos-Qwen2.5-7B-Inst-BaseMerge-TIES-details
CombinHorizon/Rombos-Qwen2.5-7B-Inst-BaseMerge-TIES
52d6f6308eba9c3a0b9116706fbb1ddc448e6101
27.14669
apache-2.0
1
7
true
false
true
false
true
1.045561
0.756402
75.64019
0.540209
34.95407
0
0
0.297819
6.375839
0.403302
8.779427
0.434176
37.130615
false
2024-10-29
2024-10-29
1
CombinHorizon/Rombos-Qwen2.5-7B-Inst-BaseMerge-TIES (Merge)
CombinHorizon_YiSM-blossom5.1-34B-SLERP_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
https://huggingface.co/CombinHorizon/YiSM-blossom5.1-34B-SLERP 📑 https://huggingface.co/datasets/open-llm-leaderboard/CombinHorizon__YiSM-blossom5.1-34B-SLERP-details
CombinHorizon/YiSM-blossom5.1-34B-SLERP
ebd8d6507623008567a0548cd0ff9e28cbd6a656
31.392518
apache-2.0
0
34
true
false
true
false
true
3.070814
0.503311
50.331121
0.620755
46.397613
0.216012
21.601208
0.355705
14.09396
0.441344
14.367969
0.474069
41.563239
false
2024-08-27
2024-08-27
1
CombinHorizon/YiSM-blossom5.1-34B-SLERP (Merge)
CoolSpring_Qwen2-0.5B-Abyme_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
https://huggingface.co/CoolSpring/Qwen2-0.5B-Abyme 📑 https://huggingface.co/datasets/open-llm-leaderboard/CoolSpring__Qwen2-0.5B-Abyme-details
CoolSpring/Qwen2-0.5B-Abyme
a48b7c04b854e5c60fe3464f96904bfc53c8640c
4.798584
apache-2.0
0
0
true
true
true
false
true
1.177797
0.191519
19.15185
0.286183
2.276484
0.017372
1.73716
0.253356
0.447427
0.354219
1.477344
0.133311
3.701241
false
2024-07-18
2024-09-04
1
Qwen/Qwen2-0.5B
CoolSpring_Qwen2-0.5B-Abyme-merge2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
https://huggingface.co/CoolSpring/Qwen2-0.5B-Abyme-merge2 📑 https://huggingface.co/datasets/open-llm-leaderboard/CoolSpring__Qwen2-0.5B-Abyme-merge2-details
CoolSpring/Qwen2-0.5B-Abyme-merge2
02c4c601453f7ecbfab5c95bf5afa889350026ba
6.118848
apache-2.0
0
0
true
false
true
false
true
0.609695
0.202185
20.218465
0.299427
3.709041
0.021148
2.114804
0.260067
1.342282
0.368729
3.891146
0.148936
5.437352
false
2024-07-27
2024-07-27
1
CoolSpring/Qwen2-0.5B-Abyme-merge2 (Merge)
CoolSpring_Qwen2-0.5B-Abyme-merge3_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
https://huggingface.co/CoolSpring/Qwen2-0.5B-Abyme-merge3 📑 https://huggingface.co/datasets/open-llm-leaderboard/CoolSpring__Qwen2-0.5B-Abyme-merge3-details
CoolSpring/Qwen2-0.5B-Abyme-merge3
86fed893893cc2a6240f0ea09ce2eeda1a5178cc
6.706903
apache-2.0
0
0
true
false
true
false
true
0.610171
0.238605
23.860468
0.300314
4.301149
0.024924
2.492447
0.264262
1.901566
0.350094
2.128385
0.150017
5.557402
false
2024-07-27
2024-07-27
1
CoolSpring/Qwen2-0.5B-Abyme-merge3 (Merge)
Corianas_llama-3-reactor_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/Corianas/llama-3-reactor 📑 https://huggingface.co/datasets/open-llm-leaderboard/Corianas__llama-3-reactor-details
Corianas/llama-3-reactor
bef2eac42fd89baa0064badbc9c7958ad9ccbed3
14.020646
apache-2.0
0
-1
true
true
true
false
false
0.821165
0.230012
23.001192
0.445715
21.88856
0.048338
4.833837
0.297819
6.375839
0.397719
8.014844
0.280086
20.009604
false
2024-07-20
2024-07-23
0
Corianas/llama-3-reactor
CortexLM_btlm-7b-base-v0.2_bfloat16
bfloat16
🟢 pretrained
🟢
Original
LlamaForCausalLM
https://huggingface.co/CortexLM/btlm-7b-base-v0.2 📑 https://huggingface.co/datasets/open-llm-leaderboard/CortexLM__btlm-7b-base-v0.2-details
CortexLM/btlm-7b-base-v0.2
eda8b4298365a26c8981316e09427c237b11217f
8.869902
mit
1
6
true
true
true
false
false
0.711358
0.148329
14.832866
0.400641
16.193277
0.012085
1.208459
0.253356
0.447427
0.384604
5.542188
0.234957
14.995198
false
2024-06-13
2024-06-26
0
CortexLM/btlm-7b-base-v0.2
Cran-May_T.E-8.1_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
Qwen2ForCausalLM
https://huggingface.co/Cran-May/T.E-8.1 📑 https://huggingface.co/datasets/open-llm-leaderboard/Cran-May__T.E-8.1-details
Cran-May/T.E-8.1
5f84709710dcce7cc05fa12473e8bb207fe25849
29.405457
cc-by-nc-sa-4.0
3
7
true
true
true
false
true
1.090633
0.707692
70.769226
0.558175
37.024377
0.067976
6.797583
0.312919
8.389262
0.450521
15.315104
0.443235
38.13719
false
2024-09-27
2024-09-29
1
Cran-May/T.E-8.1 (Merge)
CultriX_Qwen2.5-14B-MegaMerge-pt2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
https://huggingface.co/CultriX/Qwen2.5-14B-MegaMerge-pt2 📑 https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-MegaMerge-pt2-details
CultriX/Qwen2.5-14B-MegaMerge-pt2
20397f6cafc09c2cb74f105867cd99b3c68c71dc
36.694314
apache-2.0
2
14
true
false
true
false
false
2.250434
0.568308
56.830765
0.65777
50.907903
0.273414
27.34139
0.379195
17.225951
0.472875
18.742708
0.542055
49.117169
false
2024-10-24
2024-10-25
1
CultriX/Qwen2.5-14B-MegaMerge-pt2 (Merge)
CultriX_Qwen2.5-14B-MergeStock_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
https://huggingface.co/CultriX/Qwen2.5-14B-MergeStock 📑 https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-MergeStock-details
CultriX/Qwen2.5-14B-MergeStock
fa00543296f2731793dfb0aac667571ccf1abb5b
36.390259
2
14
false
true
true
false
false
4.430606
0.568533
56.85326
0.657934
51.009391
0.273414
27.34139
0.373322
16.442953
0.467635
17.854427
0.539561
48.84013
false
2024-10-23
2024-10-24
1
CultriX/Qwen2.5-14B-MergeStock (Merge)
CultriX_Qwen2.5-14B-Wernicke_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
https://huggingface.co/CultriX/Qwen2.5-14B-Wernicke 📑 https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-Wernicke-details
CultriX/Qwen2.5-14B-Wernicke
622c0a58ecb0c0c679d7381a823d2ae5ac2b8ce1
36.999242
1
14
false
true
true
false
false
2.222234
0.52347
52.346995
0.656836
50.642876
0.324773
32.477341
0.393456
19.127517
0.468906
18.246615
0.542387
49.154108
false
2024-10-21
2024-10-22
1
CultriX/Qwen2.5-14B-Wernicke (Merge)
CultriX_Qwen2.5-14B-Wernicke-SLERP_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Qwen2ForCausalLM
https://huggingface.co/CultriX/Qwen2.5-14B-Wernicke-SLERP 📑 https://huggingface.co/datasets/open-llm-leaderboard/CultriX__Qwen2.5-14B-Wernicke-SLERP-details
CultriX/Qwen2.5-14B-Wernicke-SLERP
180175561e8061be067fc349ad4491270f19976f
30.639825
0
14
false
true
true
false
true
2.155988
0.55889
55.889041
0.644093
49.372327
0.094411
9.441088
0.34396
12.527964
0.414031
11.120573
0.509392
45.487958
false
2024-10-25
0
Removed
DUAL-GPO_zephyr-7b-ipo-0k-15k-i1_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Adapter
?
https://huggingface.co/DUAL-GPO/zephyr-7b-ipo-0k-15k-i1 📑 https://huggingface.co/datasets/open-llm-leaderboard/DUAL-GPO__zephyr-7b-ipo-0k-15k-i1-details
DUAL-GPO/zephyr-7b-ipo-0k-15k-i1
564d269c67dfcc5c07a4fbc270a6a48da1929d30
15.492948
0
14
false
true
true
false
false
0.971423
0.275624
27.562423
0.447271
22.658643
0.030211
3.021148
0.291107
5.480984
0.417344
10.567969
0.312999
23.666519
false
2024-09-20
2024-09-22
1
DUAL-GPO/zephyr-7b-ipo-qlora-v0-merged
DZgas_GIGABATEMAN-7B_float16
float16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
https://huggingface.co/DZgas/GIGABATEMAN-7B 📑 https://huggingface.co/datasets/open-llm-leaderboard/DZgas__GIGABATEMAN-7B-details
DZgas/GIGABATEMAN-7B
edf2840350e7fd55895d9df560b489ac10ecb95e
20.446293
5
7
false
true
true
false
false
0.630337
0.460746
46.074638
0.503218
29.827517
0.053625
5.362538
0.28943
5.257271
0.432844
11.972135
0.317653
24.183658
false
2024-04-17
2024-09-15
1
DZgas/GIGABATEMAN-7B (Merge)
Dampfinchen_Llama-3.1-8B-Ultra-Instruct_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
https://huggingface.co/Dampfinchen/Llama-3.1-8B-Ultra-Instruct 📑 https://huggingface.co/datasets/open-llm-leaderboard/Dampfinchen__Llama-3.1-8B-Ultra-Instruct-details
Dampfinchen/Llama-3.1-8B-Ultra-Instruct
46662d14130cfd34f7d90816540794f24a301f86
29.127051
llama3
7
8
true
false
true
false
true
0.836479
0.808109
80.810915
0.525753
32.494587
0.15861
15.861027
0.291946
5.592841
0.400323
8.607031
0.382563
31.395907
false
2024-08-26
2024-08-26
1
Dampfinchen/Llama-3.1-8B-Ultra-Instruct (Merge)
Danielbrdz_Barcenas-14b-Phi-3-medium-ORPO_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
MistralForCausalLM
https://huggingface.co/Danielbrdz/Barcenas-14b-Phi-3-medium-ORPO 📑 https://huggingface.co/datasets/open-llm-leaderboard/Danielbrdz__Barcenas-14b-Phi-3-medium-ORPO-details
Danielbrdz/Barcenas-14b-Phi-3-medium-ORPO
b749dbcb19901b8fd0e9f38c923a24533569f895
31.738448
mit
5
13
true
true
true
false
true
1.572315
0.479906
47.990554
0.653618
51.029418
0.193353
19.335347
0.326342
10.178971
0.48075
20.527083
0.472324
41.369311
false
2024-06-15
2024-08-13
0
Danielbrdz/Barcenas-14b-Phi-3-medium-ORPO
Danielbrdz_Barcenas-Llama3-8b-ORPO_float16
float16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
https://huggingface.co/Danielbrdz/Barcenas-Llama3-8b-ORPO 📑 https://huggingface.co/datasets/open-llm-leaderboard/Danielbrdz__Barcenas-Llama3-8b-ORPO-details
Danielbrdz/Barcenas-Llama3-8b-ORPO
66c848c4526d3db1ec41468c0f73ac4448c6abe9
26.519005
other
7
8
true
true
true
false
true
0.774159
0.737243
73.724274
0.498656
28.600623
0.06571
6.570997
0.307047
7.606264
0.418958
11.169792
0.382979
31.44208
false
2024-04-29
2024-06-29
0
Danielbrdz/Barcenas-Llama3-8b-ORPO
Dans-DiscountModels_Dans-Instruct-CoreCurriculum-12b-ChatML_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
https://huggingface.co/Dans-DiscountModels/Dans-Instruct-CoreCurriculum-12b-ChatML 📑 https://huggingface.co/datasets/open-llm-leaderboard/Dans-DiscountModels__Dans-Instruct-CoreCurriculum-12b-ChatML-details
Dans-DiscountModels/Dans-Instruct-CoreCurriculum-12b-ChatML
56925fafe6a543e224db36864dd0927171542776
12.913452
apache-2.0
0
12
true
true
true
false
false
3.234644
0.211102
21.11021
0.479186
26.046417
0.005287
0.528701
0.280201
4.026846
0.360635
5.71276
0.280502
20.055777
false
2024-09-04
2024-09-04
1
mistralai/Mistral-Nemo-Base-2407
Dans-DiscountModels_Dans-Instruct-Mix-8b-ChatML_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/Dans-DiscountModels/Dans-Instruct-Mix-8b-ChatML 📑 https://huggingface.co/datasets/open-llm-leaderboard/Dans-DiscountModels__Dans-Instruct-Mix-8b-ChatML-details
Dans-DiscountModels/Dans-Instruct-Mix-8b-ChatML
029d84d4f4a618aa798490c046753b12801158e2
13.49618
apache-2.0
0
8
true
true
true
false
false
0.798569
0.082508
8.250775
0.473817
26.336394
0.053625
5.362538
0.294463
5.928412
0.391823
9.677865
0.32879
25.421099
false
2024-09-09
2024-09-14
1
Dans-DiscountModels/Meta-Llama-3.1-8B-ChatML
Dans-DiscountModels_Dans-Instruct-Mix-8b-ChatML-V0.1.0_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/Dans-DiscountModels/Dans-Instruct-Mix-8b-ChatML-V0.1.0 📑 https://huggingface.co/datasets/open-llm-leaderboard/Dans-DiscountModels__Dans-Instruct-Mix-8b-ChatML-V0.1.0-details
Dans-DiscountModels/Dans-Instruct-Mix-8b-ChatML-V0.1.0
9367c1273b0025793531fcf3a2c15416539f5d81
12.974202
apache-2.0
0
8
true
true
true
false
false
0.814699
0.06682
6.682048
0.477477
26.737652
0.061178
6.117825
0.286074
4.809843
0.378583
8.122917
0.328374
25.374926
false
2024-09-20
2024-09-20
1
Dans-DiscountModels/Meta-Llama-3.1-8B-ChatML
Dans-DiscountModels_Dans-Instruct-Mix-8b-ChatML-V0.1.1_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/Dans-DiscountModels/Dans-Instruct-Mix-8b-ChatML-V0.1.1 📑 https://huggingface.co/datasets/open-llm-leaderboard/Dans-DiscountModels__Dans-Instruct-Mix-8b-ChatML-V0.1.1-details
Dans-DiscountModels/Dans-Instruct-Mix-8b-ChatML-V0.1.1
a6188cd1807d0d72e55adc371ddd198d7e9aa7ae
13.311583
apache-2.0
0
8
true
true
true
false
false
0.790589
0.091051
9.105063
0.474865
26.412551
0.057402
5.740181
0.291107
5.480984
0.38249
7.811198
0.327876
25.319518
false
2024-09-22
2024-09-23
1
Dans-DiscountModels/Meta-Llama-3.1-8B-ChatML
Dans-DiscountModels_Dans-Instruct-Mix-8b-ChatML-V0.2.0_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/Dans-DiscountModels/Dans-Instruct-Mix-8b-ChatML-V0.2.0 📑 https://huggingface.co/datasets/open-llm-leaderboard/Dans-DiscountModels__Dans-Instruct-Mix-8b-ChatML-V0.2.0-details
Dans-DiscountModels/Dans-Instruct-Mix-8b-ChatML-V0.2.0
15a9988381fdba15281f1bd6b04c34f3f96120cc
18.590919
1
8
false
true
true
false
true
0.843717
0.506409
50.640855
0.462426
24.734771
0.043807
4.380665
0.293624
5.816555
0.364448
3.75599
0.29995
22.216681
false
2024-09-30
2024-09-30
1
Dans-DiscountModels/Meta-Llama-3.1-8B-ChatML
Darkknight535_OpenCrystal-12B-L3_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
https://huggingface.co/Darkknight535/OpenCrystal-12B-L3 📑 https://huggingface.co/datasets/open-llm-leaderboard/Darkknight535__OpenCrystal-12B-L3-details
Darkknight535/OpenCrystal-12B-L3
974d2d453afdde40f6a993601bbbbf9d97b43606
20.672888
14
11
false
true
true
false
false
2.012285
0.407091
40.709096
0.52226
31.844491
0.089124
8.912387
0.306208
7.494407
0.365656
5.740365
0.364029
29.336584
false
2024-08-25
2024-08-26
0
Darkknight535/OpenCrystal-12B-L3
DavidAU_L3-Dark-Planet-8B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/DavidAU/L3-Dark-Planet-8B 📑 https://huggingface.co/datasets/open-llm-leaderboard/DavidAU__L3-Dark-Planet-8B-details
DavidAU/L3-Dark-Planet-8B
462c9307ba4cfcb0c1edcceac5e06f4007bc803e
20.519537
5
8
false
true
true
false
false
0.939141
0.413411
41.341086
0.508408
29.789627
0.085347
8.534743
0.300336
6.711409
0.361594
6.332552
0.37367
30.407801
false
2024-09-05
2024-09-12
1
DavidAU/L3-Dark-Planet-8B (Merge)
DavidAU_L3-Jamet-12.2B-MK.V-Blackroot-Instruct_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/DavidAU/L3-Jamet-12.2B-MK.V-Blackroot-Instruct 📑 https://huggingface.co/datasets/open-llm-leaderboard/DavidAU__L3-Jamet-12.2B-MK.V-Blackroot-Instruct-details
DavidAU/L3-Jamet-12.2B-MK.V-Blackroot-Instruct
db4ae3d7b608fd0e7490d2fcfa0436e56e21af33
17.857043
0
12
false
true
true
false
false
1.437522
0.3962
39.619986
0.476572
25.869793
0.040785
4.07855
0.278523
3.803132
0.401969
8.31276
0.329122
25.458038
false
2024-08-23
2024-09-04
1
DavidAU/L3-Jamet-12.2B-MK.V-Blackroot-Instruct (Merge)
DavidAU_L3-Lumimaid-12.2B-v0.1-OAS-Instruct_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/DavidAU/L3-Lumimaid-12.2B-v0.1-OAS-Instruct 📑 https://huggingface.co/datasets/open-llm-leaderboard/DavidAU__L3-Lumimaid-12.2B-v0.1-OAS-Instruct-details
DavidAU/L3-Lumimaid-12.2B-v0.1-OAS-Instruct
65a9e957dc4211aa3d87fdf588767823af5cde3f
17.743439
1
12
false
true
true
false
false
1.424707
0.392403
39.240327
0.469302
24.504816
0.040785
4.07855
0.276846
3.579418
0.419427
11.261719
0.314162
23.795804
false
2024-08-24
2024-09-12
1
DavidAU/L3-Lumimaid-12.2B-v0.1-OAS-Instruct (Merge)
DavidAU_L3-SMB-Instruct-12.2B-F32_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/DavidAU/L3-SMB-Instruct-12.2B-F32 📑 https://huggingface.co/datasets/open-llm-leaderboard/DavidAU__L3-SMB-Instruct-12.2B-F32-details
DavidAU/L3-SMB-Instruct-12.2B-F32
ac5e205a41b17a7b05b1b62f352aacc7e65b2f13
18.863875
1
12
false
true
true
false
false
1.382397
0.430322
43.032155
0.478641
26.130957
0.044562
4.456193
0.281879
4.250559
0.408729
9.624479
0.3312
25.688904
false
2024-08-25
2024-09-12
1
DavidAU/L3-SMB-Instruct-12.2B-F32 (Merge)
DavidAU_L3-Stheno-Maid-Blackroot-Grand-HORROR-16B_float16
float16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/DavidAU/L3-Stheno-Maid-Blackroot-Grand-HORROR-16B 📑 https://huggingface.co/datasets/open-llm-leaderboard/DavidAU__L3-Stheno-Maid-Blackroot-Grand-HORROR-16B-details
DavidAU/L3-Stheno-Maid-Blackroot-Grand-HORROR-16B
7b626e50b6c35fcb064b8b039fcf30eae01c3fae
17.096786
0
16
false
true
true
false
false
2.922799
0.343893
34.389309
0.473633
26.692021
0.015861
1.586103
0.270973
2.796421
0.403115
8.55599
0.357048
28.560875
false
2024-08-23
2024-09-04
1
DavidAU/L3-Stheno-Maid-Blackroot-Grand-HORROR-16B (Merge)
DavidAU_L3-Stheno-v3.2-12.2B-Instruct_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/DavidAU/L3-Stheno-v3.2-12.2B-Instruct 📑 https://huggingface.co/datasets/open-llm-leaderboard/DavidAU__L3-Stheno-v3.2-12.2B-Instruct-details
DavidAU/L3-Stheno-v3.2-12.2B-Instruct
8271fc32a601a4fa5efbe58c41a0ef4181ad8836
18.790033
1
12
false
true
true
false
false
1.3977
0.402795
40.279459
0.484598
27.369623
0.053625
5.362538
0.275168
3.355705
0.41025
10.314583
0.334525
26.058289
false
2024-08-24
2024-09-12
1
DavidAU/L3-Stheno-v3.2-12.2B-Instruct (Merge)
Deci_DeciLM-7B_bfloat16
bfloat16
🟢 pretrained
🟢
Original
DeciLMForCausalLM
https://huggingface.co/Deci/DeciLM-7B 📑 https://huggingface.co/datasets/open-llm-leaderboard/Deci__DeciLM-7B-details
Deci/DeciLM-7B
c3c9f4226801dc0433f32aebffe0aac68ee2f051
14.960537
apache-2.0
224
7
true
true
true
false
false
0.642137
0.281295
28.129474
0.442286
21.25273
0.024924
2.492447
0.295302
6.040268
0.435854
13.048438
0.269199
18.799867
true
2023-12-10
2024-06-12
0
Deci/DeciLM-7B
Deci_DeciLM-7B-instruct_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
DeciLMForCausalLM
https://huggingface.co/Deci/DeciLM-7B-instruct 📑 https://huggingface.co/datasets/open-llm-leaderboard/Deci__DeciLM-7B-instruct-details
Deci/DeciLM-7B-instruct
4adc7aa9efe61b47b0a98b2cc94527d9c45c3b4f
17.457504
apache-2.0
96
7
true
true
true
false
true
0.638649
0.488024
48.8024
0.458975
23.887149
0.029456
2.945619
0.28943
5.257271
0.388417
5.985417
0.260805
17.867169
true
2023-12-10
2024-06-12
0
Deci/DeciLM-7B-instruct
DeepAutoAI_Explore_Llama-3.1-8B-Inst_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
https://huggingface.co/DeepAutoAI/Explore_Llama-3.1-8B-Inst 📑 https://huggingface.co/datasets/open-llm-leaderboard/DeepAutoAI__Explore_Llama-3.1-8B-Inst-details
DeepAutoAI/Explore_Llama-3.1-8B-Inst
9752180fafd8f584625eb649c0cba36b91bdc3ce
28.788231
apache-2.0
0
8
true
true
true
false
true
1.750239
0.779483
77.948288
0.511742
30.393263
0.192598
19.259819
0.283557
4.474273
0.390958
9.636458
0.379156
31.017287
false
2024-09-21
2024-10-09
1
DeepAutoAI/Explore_Llama-3.1-8B-Inst (Merge)
DeepAutoAI_Explore_Llama-3.2-1B-Inst_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
https://huggingface.co/DeepAutoAI/Explore_Llama-3.2-1B-Inst 📑 https://huggingface.co/datasets/open-llm-leaderboard/DeepAutoAI__Explore_Llama-3.2-1B-Inst-details
DeepAutoAI/Explore_Llama-3.2-1B-Inst
9fd790df246b8979c02173f7698819a7805fb04e
13.708555
apache-2.0
0
1
true
true
true
false
true
0.891846
0.564886
56.488561
0.350481
8.292274
0.063444
6.344411
0.255872
0.782998
0.318344
1.359635
0.180851
8.983452
false
2024-10-07
2024-10-09
1
DeepAutoAI/Explore_Llama-3.2-1B-Inst (Merge)
DeepAutoAI_Explore_Llama-3.2-1B-Inst_v0_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
https://huggingface.co/DeepAutoAI/Explore_Llama-3.2-1B-Inst_v0 📑 https://huggingface.co/datasets/open-llm-leaderboard/DeepAutoAI__Explore_Llama-3.2-1B-Inst_v0-details
DeepAutoAI/Explore_Llama-3.2-1B-Inst_v0
9509dee6b01fff1a11dc26cf58d7eecbe3d9d9c4
13.182851
1
1
false
true
true
false
true
0.467189
0.559715
55.971489
0.336509
7.042772
0.049094
4.909366
0.263423
1.789709
0.310313
0.455729
0.180352
8.928044
false
2024-10-08
2024-10-08
0
DeepAutoAI/Explore_Llama-3.2-1B-Inst_v0
DeepAutoAI_Explore_Llama-3.2-1B-Inst_v1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
https://huggingface.co/DeepAutoAI/Explore_Llama-3.2-1B-Inst_v1 📑 https://huggingface.co/datasets/open-llm-leaderboard/DeepAutoAI__Explore_Llama-3.2-1B-Inst_v1-details
DeepAutoAI/Explore_Llama-3.2-1B-Inst_v1
3f8b0fb6dcc1e9725ba52dd086241d5d9e413100
10.619319
apache-2.0
0
1
true
true
true
false
true
0.469966
0.499889
49.988918
0.314148
4.25778
0.01284
1.283988
0.244966
0
0.378094
5.195052
0.126912
2.990174
false
2024-10-08
2024-10-08
1
DeepAutoAI/Explore_Llama-3.2-1B-Inst_v1 (Merge)
DeepAutoAI_Explore_Llama-3.2-1B-Inst_v1.1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
https://huggingface.co/DeepAutoAI/Explore_Llama-3.2-1B-Inst_v1.1 📑 https://huggingface.co/datasets/open-llm-leaderboard/DeepAutoAI__Explore_Llama-3.2-1B-Inst_v1.1-details
DeepAutoAI/Explore_Llama-3.2-1B-Inst_v1.1
158b977bca89e073871e2313740a7c75eb1291af
14.211124
apache-2.0
0
1
true
true
true
false
true
0.912912
0.584419
58.441934
0.351266
8.818154
0.06571
6.570997
0.262584
1.677852
0.311708
0.663542
0.181848
9.094267
false
2024-10-09
2024-10-17
1
DeepAutoAI/Explore_Llama-3.2-1B-Inst_v1.1 (Merge)
DeepAutoAI_causal_gpt2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
GPT2LMHeadModel
https://huggingface.co/DeepAutoAI/causal_gpt2 📑 https://huggingface.co/datasets/open-llm-leaderboard/DeepAutoAI__causal_gpt2-details
DeepAutoAI/causal_gpt2
995f029f6645dde1ef830406001754b904c49775
5.981707
0
0
false
true
true
false
false
0.125865
0.181277
18.127679
0.302571
2.633344
0.002266
0.226586
0.260067
1.342282
0.426958
12.103125
0.113115
1.457225
false
2024-10-17
2024-10-17
0
DeepAutoAI/causal_gpt2
DeepAutoAI_d2nwg_Llama-3.1-8B-Instruct-v0.0_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
https://huggingface.co/DeepAutoAI/d2nwg_Llama-3.1-8B-Instruct-v0.0 📑 https://huggingface.co/datasets/open-llm-leaderboard/DeepAutoAI__d2nwg_Llama-3.1-8B-Instruct-v0.0-details
DeepAutoAI/d2nwg_Llama-3.1-8B-Instruct-v0.0
8bad8800d04a06f3f906728ee223cab2f50453a0
27.727687
0
8
false
true
true
false
true
0.856178
0.789275
78.927468
0.508041
30.510076
0.083837
8.383686
0.291946
5.592841
0.413469
10.983594
0.387716
31.968454
false
2024-09-10
2024-09-10
0
DeepAutoAI/d2nwg_Llama-3.1-8B-Instruct-v0.0
DeepAutoAI_d2nwg_causal_gpt2_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
GPT2LMHeadModel
https://huggingface.co/DeepAutoAI/d2nwg_causal_gpt2 📑 https://huggingface.co/datasets/open-llm-leaderboard/DeepAutoAI__d2nwg_causal_gpt2-details
DeepAutoAI/d2nwg_causal_gpt2
eab065cba5a7a9b08f8b264d61d504c4ecbb611b
6.292853
0
0
false
true
true
false
false
0.129907
0.191618
19.161824
0.30269
2.850574
0.003776
0.377644
0.25755
1.006711
0.429719
12.68151
0.11511
1.678856
false
2024-10-18
2024-10-18
0
DeepAutoAI/d2nwg_causal_gpt2
DeepAutoAI_d2nwg_causal_gpt2_v1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
GPT2LMHeadModel
https://huggingface.co/DeepAutoAI/d2nwg_causal_gpt2_v1 📑 https://huggingface.co/datasets/open-llm-leaderboard/DeepAutoAI__d2nwg_causal_gpt2_v1-details
DeepAutoAI/d2nwg_causal_gpt2_v1
3f40c3dcb3eb591dec80ff03573eec7928a7feaa
6.381801
0
0
false
true
true
false
false
0.230406
0.198862
19.886235
0.29919
2.387278
0.001511
0.151057
0.258389
1.118568
0.433688
13.244271
0.113531
1.503398
false
2024-10-18
2024-10-19
0
DeepAutoAI/d2nwg_causal_gpt2_v1
DeepAutoAI_ldm_soup_Llama-3.1-8B-Inst_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
https://huggingface.co/DeepAutoAI/ldm_soup_Llama-3.1-8B-Inst 📑 https://huggingface.co/datasets/open-llm-leaderboard/DeepAutoAI__ldm_soup_Llama-3.1-8B-Inst-details
DeepAutoAI/ldm_soup_Llama-3.1-8B-Inst
0f04c5ad830f8ae0828191a4670fd4ba361b63d2
28.763892
apache-2.0
3
8
true
true
true
false
true
1.704023
0.803263
80.326312
0.512117
31.101628
0.123112
12.311178
0.28943
5.257271
0.416135
11.516927
0.38863
32.070035
false
2024-09-16
2024-10-09
1
DeepAutoAI/ldm_soup_Llama-3.1-8B-Inst (Merge)
DeepAutoAI_ldm_soup_Llama-3.1-8B-Instruct-v0.0_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
https://huggingface.co/DeepAutoAI/ldm_soup_Llama-3.1-8B-Instruct-v0.0 📑 https://huggingface.co/datasets/open-llm-leaderboard/DeepAutoAI__ldm_soup_Llama-3.1-8B-Instruct-v0.0-details
DeepAutoAI/ldm_soup_Llama-3.1-8B-Instruct-v0.0
210a97b4dadbda63cc9fe459e8415d4cd3bbaf99
28.375728
0
8
false
true
true
false
true
0.860455
0.78895
78.894999
0.512518
31.162649
0.110272
11.02719
0.291107
5.480984
0.412135
11.516927
0.389545
32.171616
false
2024-09-14
2024-09-15
0
DeepAutoAI/ldm_soup_Llama-3.1-8B-Instruct-v0.0
DeepAutoAI_ldm_soup_Llama-3.1-8B-Instruct-v0.1_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
https://huggingface.co/DeepAutoAI/ldm_soup_Llama-3.1-8B-Instruct-v0.1 📑 https://huggingface.co/datasets/open-llm-leaderboard/DeepAutoAI__ldm_soup_Llama-3.1-8B-Instruct-v0.1-details
DeepAutoAI/ldm_soup_Llama-3.1-8B-Instruct-v0.1
ecd140c95985b4292c896e25a94a7629d2924ad1
28.375728
0
8
false
true
true
false
true
0.828446
0.78895
78.894999
0.512518
31.162649
0.110272
11.02719
0.291107
5.480984
0.412135
11.516927
0.389545
32.171616
false
2024-09-15
2024-09-16
0
DeepAutoAI/ldm_soup_Llama-3.1-8B-Instruct-v0.1
DeepMount00_Lexora-Lite-3B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
https://huggingface.co/DeepMount00/Lexora-Lite-3B 📑 https://huggingface.co/datasets/open-llm-leaderboard/DeepMount00__Lexora-Lite-3B-details
DeepMount00/Lexora-Lite-3B
2cf39db7ecac17edca0bf4e0973b7fb58c40c22c
21.618583
0
3
false
true
true
false
true
2.2935
0.572104
57.210424
0.479713
27.199848
0.053625
5.362538
0.280201
4.026846
0.395552
7.877344
0.352311
28.034501
false
2024-09-19
2024-10-20
0
DeepMount00/Lexora-Lite-3B
DeepMount00_Lexora-Medium-7B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
https://huggingface.co/DeepMount00/Lexora-Medium-7B 📑 https://huggingface.co/datasets/open-llm-leaderboard/DeepMount00__Lexora-Medium-7B-details
DeepMount00/Lexora-Medium-7B
c53d166f4f2996a5b7f161529f1ea6548b54a2b2
24.653915
apache-2.0
5
7
true
true
true
false
true
1.734911
0.410338
41.03379
0.514484
32.695331
0.151057
15.10574
0.305369
7.38255
0.443948
14.760156
0.432513
36.945922
false
2024-09-24
2024-09-24
0
DeepMount00/Lexora-Medium-7B
DeepMount00_Llama-3-8b-Ita_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/DeepMount00/Llama-3-8b-Ita 📑 https://huggingface.co/datasets/open-llm-leaderboard/DeepMount00__Llama-3-8b-Ita-details
DeepMount00/Llama-3-8b-Ita
d40847d2981b588690c1dc21d5157d3f4afb2978
26.733876
llama3
23
8
true
true
true
false
true
0.778258
0.75303
75.302974
0.493577
28.077746
0.062689
6.268882
0.305369
7.38255
0.426771
11.679688
0.385223
31.691415
false
2024-05-01
2024-06-27
1
meta-llama/Meta-Llama-3-8B
DeepMount00_Llama-3.1-8b-ITA_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/DeepMount00/Llama-3.1-8b-ITA 📑 https://huggingface.co/datasets/open-llm-leaderboard/DeepMount00__Llama-3.1-8b-ITA-details
DeepMount00/Llama-3.1-8b-ITA
5ede1e388b6b15bc06acd364a8f805fe9ed16db9
28.228098
5
8
false
true
true
false
true
2.507574
0.791673
79.167276
0.510936
30.933181
0.108761
10.876133
0.287752
5.033557
0.413594
11.399219
0.387633
31.95922
false
2024-08-13
2024-10-28
2
meta-llama/Meta-Llama-3.1-8B
DeepMount00_Llama-3.1-Distilled_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/DeepMount00/Llama-3.1-Distilled 📑 https://huggingface.co/datasets/open-llm-leaderboard/DeepMount00__Llama-3.1-Distilled-details
DeepMount00/Llama-3.1-Distilled
0a94c7ddb196107e8bf1b02e31488ff8c17b9eb3
28.838347
llama3
0
8
true
true
true
false
true
0.839
0.784379
78.437878
0.510088
30.841421
0.155589
15.558912
0.303691
7.158837
0.405812
10.126562
0.378158
30.906472
false
2024-10-25
2024-10-25
1
meta-llama/Meta-Llama-3-8B
DeepMount00_Qwen2.5-7B-Instruct-MathCoder_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
https://huggingface.co/DeepMount00/Qwen2.5-7B-Instruct-MathCoder 📑 https://huggingface.co/datasets/open-llm-leaderboard/DeepMount00__Qwen2.5-7B-Instruct-MathCoder-details
DeepMount00/Qwen2.5-7B-Instruct-MathCoder
90df996cdb1f3d5f051513c50df4cdfda858b5f2
4.384322
0
7
false
true
true
false
true
1.29268
0.153025
15.302508
0.299844
2.636671
0
0
0.262584
1.677852
0.380635
5.379427
0.111785
1.309471
false
2024-10-24
0
Removed
DeepMount00_mergekit-ties-okvgjfz_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
https://huggingface.co/DeepMount00/mergekit-ties-okvgjfz 📑 https://huggingface.co/datasets/open-llm-leaderboard/DeepMount00__mergekit-ties-okvgjfz-details
DeepMount00/mergekit-ties-okvgjfz
90df996cdb1f3d5f051513c50df4cdfda858b5f2
4.384322
0
7
false
true
true
false
true
1.288821
0.153025
15.302508
0.299844
2.636671
0
0
0.262584
1.677852
0.380635
5.379427
0.111785
1.309471
false
2024-10-24
0
Removed
Delta-Vector_Baldur-8B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
https://huggingface.co/Delta-Vector/Baldur-8B 📑 https://huggingface.co/datasets/open-llm-leaderboard/Delta-Vector__Baldur-8B-details
Delta-Vector/Baldur-8B
97f5d321a8346551a5ed704997dd1e93c59883f3
24.128795
agpl-3.0
3
8
true
true
true
false
false
2.294232
0.478182
47.818233
0.530584
32.541834
0.139728
13.97281
0.302013
6.935123
0.437156
14.011198
0.365442
29.493573
false
2024-09-23
2024-10-06
1
Delta-Vector/Baldur-8B (Merge)
Delta-Vector_Darkens-8B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
https://huggingface.co/Delta-Vector/Darkens-8B 📑 https://huggingface.co/datasets/open-llm-leaderboard/Delta-Vector__Darkens-8B-details
Delta-Vector/Darkens-8B
e82be0389bfcecd1998dba1c3bb35b8d95d01bf2
18.874475
agpl-3.0
4
8
true
true
true
false
false
1.199743
0.254766
25.476624
0.525059
32.883795
0.055136
5.513595
0.324664
9.955257
0.410552
9.01901
0.373587
30.398567
false
2024-09-22
2024-10-06
1
Delta-Vector/Darkens-8B (Merge)
Delta-Vector_Henbane-7b-attempt2_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Qwen2ForCausalLM
https://huggingface.co/Delta-Vector/Henbane-7b-attempt2 📑 https://huggingface.co/datasets/open-llm-leaderboard/Delta-Vector__Henbane-7b-attempt2-details
Delta-Vector/Henbane-7b-attempt2
448ef54e5af03e13f16f3db8ad8d1481479ac12e
23.801362
apache-2.0
0
7
true
true
true
false
true
1.133838
0.415734
41.573359
0.506118
30.865849
0.226586
22.65861
0.290268
5.369128
0.397344
8.701302
0.402759
33.639923
false
2024-09-13
2024-10-11
1
Qwen/Qwen2-7B
Delta-Vector_Odin-9B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
Gemma2ForCausalLM
https://huggingface.co/Delta-Vector/Odin-9B 📑 https://huggingface.co/datasets/open-llm-leaderboard/Delta-Vector__Odin-9B-details
Delta-Vector/Odin-9B
9ff20f5dd427e751ada834319bfdd9ea60b5e89c
24.914172
agpl-3.0
3
9
true
true
true
false
false
2.708162
0.369197
36.919706
0.544025
34.832423
0.141239
14.123867
0.341443
12.192394
0.464781
17.564323
0.404671
33.85232
false
2024-09-27
2024-10-06
0
Delta-Vector/Odin-9B
Delta-Vector_Tor-8B_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/Delta-Vector/Tor-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Delta-Vector/Tor-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Delta-Vector__Tor-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Delta-Vector/Tor-8B
d30a7a121c2ef5dc14004cfdf3fd13208dfbdb4f
18.419467
agpl-3.0
2
8
true
true
true
false
false
1.252053
0.238155
23.815476
0.520911
31.738224
0.059668
5.966767
0.323826
9.8434
0.409219
8.81901
0.373005
30.333924
false
2024-09-21
2024-10-06
1
Delta-Vector/Tor-8B (Merge)
DreadPoor_Aspire-8B-model_stock_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DreadPoor/Aspire-8B-model_stock" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DreadPoor/Aspire-8B-model_stock</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__Aspire-8B-model_stock-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DreadPoor/Aspire-8B-model_stock
5c23cb2aff877d0b7bdcfa4de43d1bc8a1852de0
28.523165
apache-2.0
3
8
true
false
true
false
true
0.843128
0.714062
71.406202
0.527825
32.53427
0.14426
14.425982
0.314597
8.612975
0.42125
13.45625
0.37633
30.70331
false
2024-09-16
2024-09-17
1
DreadPoor/Aspire-8B-model_stock (Merge)
DreadPoor_Aspire_1.3-8B_model-stock_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DreadPoor/Aspire_1.3-8B_model-stock" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DreadPoor/Aspire_1.3-8B_model-stock</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__Aspire_1.3-8B_model-stock-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DreadPoor/Aspire_1.3-8B_model-stock
d36f5540e8c5654a9fdd8ece9ba8e88af26e5c40
28.388802
apache-2.0
0
8
true
false
true
false
true
0.715781
0.706169
70.616852
0.530164
32.661851
0.169184
16.918429
0.307886
7.718121
0.410458
12.240625
0.371592
30.176936
false
2024-11-01
2024-11-01
1
DreadPoor/Aspire_1.3-8B_model-stock (Merge)
DreadPoor_Aurora_faustus-8B-LINEAR_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DreadPoor/Aurora_faustus-8B-LINEAR" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DreadPoor/Aurora_faustus-8B-LINEAR</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__Aurora_faustus-8B-LINEAR-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DreadPoor/Aurora_faustus-8B-LINEAR
76acf1ac703eb827d2541d07a8d4a7cba4b731d4
29.569556
apache-2.0
1
8
true
false
true
false
true
0.767377
0.7281
72.810033
0.551554
36.263482
0.167674
16.767372
0.307047
7.606264
0.414583
12.389583
0.384225
31.5806
false
2024-09-25
2024-09-26
1
DreadPoor/Aurora_faustus-8B-LINEAR (Merge)
DreadPoor_Aurora_faustus-8B-LORABLATED_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DreadPoor/Aurora_faustus-8B-LORABLATED" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DreadPoor/Aurora_faustus-8B-LORABLATED</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__Aurora_faustus-8B-LORABLATED-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DreadPoor/Aurora_faustus-8B-LORABLATED
97746081f7c681dcf7fad10c57de9a341aa10db1
29.064263
apache-2.0
1
8
true
false
true
false
true
0.80085
0.752705
75.270504
0.53916
34.199935
0.145015
14.501511
0.302013
6.935123
0.423854
13.781771
0.367271
29.696735
false
2024-09-29
2024-09-29
1
DreadPoor/Aurora_faustus-8B-LORABLATED (Merge)
DreadPoor_Aurora_faustus-8B-LORABLATED_ALT_float16
float16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DreadPoor/Aurora_faustus-8B-LORABLATED_ALT" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DreadPoor/Aurora_faustus-8B-LORABLATED_ALT</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__Aurora_faustus-8B-LORABLATED_ALT-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DreadPoor/Aurora_faustus-8B-LORABLATED_ALT
3ca36587d26bfd936aa1358adc1eabf377aa1e98
28.934153
apache-2.0
1
8
true
false
true
false
true
0.795116
0.737792
73.779239
0.538767
34.21152
0.154079
15.407855
0.298658
6.487696
0.422521
13.781771
0.369432
29.936835
false
2024-09-29
2024-09-29
1
DreadPoor/Aurora_faustus-8B-LORABLATED_ALT (Merge)
DreadPoor_BaeZel-8B-LINEAR_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DreadPoor/BaeZel-8B-LINEAR" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DreadPoor/BaeZel-8B-LINEAR</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__BaeZel-8B-LINEAR-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DreadPoor/BaeZel-8B-LINEAR
1deac3287de191794c50543d69d523f43654a803
30.296459
apache-2.0
0
8
true
false
true
false
true
0.665069
0.737792
73.779239
0.54638
35.535376
0.178248
17.824773
0.321309
9.50783
0.422708
13.338542
0.386137
31.792996
false
2024-11-08
2024-11-08
1
DreadPoor/BaeZel-8B-LINEAR (Merge)
DreadPoor_Damasteel-8B-LINEAR_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DreadPoor/Damasteel-8B-LINEAR" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DreadPoor/Damasteel-8B-LINEAR</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__Damasteel-8B-LINEAR-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DreadPoor/Damasteel-8B-LINEAR
cfc389c15e614b14f1d8d16740dcc183047b435a
28.964891
apache-2.0
0
8
true
false
true
false
true
0.674659
0.738442
73.844178
0.538814
34.106138
0.166163
16.616314
0.298658
6.487696
0.42125
11.85625
0.377909
30.878768
false
2024-11-01
2024-11-01
1
DreadPoor/Damasteel-8B-LINEAR (Merge)
DreadPoor_Emu_Eggs-9B-Model_Stock_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
Gemma2ForCausalLM
<a target="_blank" href="https://huggingface.co/DreadPoor/Emu_Eggs-9B-Model_Stock" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DreadPoor/Emu_Eggs-9B-Model_Stock</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__Emu_Eggs-9B-Model_Stock-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DreadPoor/Emu_Eggs-9B-Model_Stock
3fb1b2da72f3618f6943aedfd1600df27886792a
29.611715
apache-2.0
2
9
true
false
true
false
true
3.08835
0.760698
76.069828
0.605166
42.783674
0.02568
2.567976
0.333054
11.073826
0.407083
9.31875
0.422706
35.856235
false
2024-10-18
2024-10-18
0
DreadPoor/Emu_Eggs-9B-Model_Stock
DreadPoor_Eunoia_Vespera-8B-LINEAR_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DreadPoor/Eunoia_Vespera-8B-LINEAR" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DreadPoor/Eunoia_Vespera-8B-LINEAR</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__Eunoia_Vespera-8B-LINEAR-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DreadPoor/Eunoia_Vespera-8B-LINEAR
c674956327af664735cf39b20c7a8276dfa579f9
28.931156
apache-2.0
2
8
true
false
true
false
true
0.81326
0.723529
72.352912
0.539931
34.216103
0.152568
15.256798
0.307047
7.606264
0.41849
12.611198
0.383893
31.543661
false
2024-09-22
2024-09-22
1
DreadPoor/Eunoia_Vespera-8B-LINEAR (Merge)
DreadPoor_Heart_Stolen-8B-Model_Stock_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DreadPoor/Heart_Stolen-8B-Model_Stock" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DreadPoor/Heart_Stolen-8B-Model_Stock</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__Heart_Stolen-8B-Model_Stock-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DreadPoor/Heart_Stolen-8B-Model_Stock
6d77987af7115c7455ddb072c48316815b018999
29.24739
apache-2.0
5
8
true
false
true
false
true
0.749301
0.724453
72.445334
0.539544
34.444822
0.162387
16.238671
0.317114
8.948546
0.416229
12.361979
0.379405
31.044991
false
2024-09-09
2024-09-10
1
DreadPoor/Heart_Stolen-8B-Model_Stock (Merge)
DreadPoor_Heart_Stolen-ALT-8B-Model_Stock_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DreadPoor/Heart_Stolen-ALT-8B-Model_Stock" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DreadPoor/Heart_Stolen-ALT-8B-Model_Stock</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__Heart_Stolen-ALT-8B-Model_Stock-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DreadPoor/Heart_Stolen-ALT-8B-Model_Stock
03d1d70cb7eb5a743468b97c9c580028df487564
27.754545
apache-2.0
2
8
true
false
true
false
true
0.735627
0.718358
71.83584
0.526338
32.354424
0.149547
14.954683
0.301174
6.823266
0.4055
9.754167
0.377244
30.804891
false
2024-09-11
2024-09-11
1
DreadPoor/Heart_Stolen-ALT-8B-Model_Stock (Merge)
DreadPoor_Irina-8B-model_stock_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DreadPoor/Irina-8B-model_stock" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DreadPoor/Irina-8B-model_stock</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__Irina-8B-model_stock-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DreadPoor/Irina-8B-model_stock
b282e3ab449d71a31f48b8c13eb43a4435968728
25.32468
0
8
false
true
true
false
true
0.745586
0.67994
67.994034
0.523664
32.08833
0.100453
10.045317
0.284396
4.58613
0.400292
8.636458
0.35738
28.597813
false
2024-08-30
0
Removed
DreadPoor_ONeil-model_stock-8B_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DreadPoor/ONeil-model_stock-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DreadPoor/ONeil-model_stock-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__ONeil-model_stock-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DreadPoor/ONeil-model_stock-8B
d4b84956211fd57b85122fe0c6f88b2cb9a9c86a
26.935908
apache-2.0
2
8
true
false
true
false
true
0.764449
0.678566
67.85662
0.554834
36.412613
0.101208
10.120846
0.305369
7.38255
0.417344
10.967969
0.359874
28.874852
false
2024-07-06
2024-07-15
1
DreadPoor/ONeil-model_stock-8B (Merge)
DreadPoor_Promissum_Mane-8B-LINEAR_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DreadPoor/Promissum_Mane-8B-LINEAR" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DreadPoor/Promissum_Mane-8B-LINEAR</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__Promissum_Mane-8B-LINEAR-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DreadPoor/Promissum_Mane-8B-LINEAR
ff399e7004040e1807e8d08b4d0967206fc50afa
29.049297
1
8
false
true
true
false
true
0.828647
0.715036
71.50361
0.545768
35.25319
0.152568
15.256798
0.30453
7.270694
0.420042
13.338542
0.385057
31.672946
false
2024-09-30
2024-09-30
1
DreadPoor/Promissum_Mane-8B-LINEAR (Merge)
DreadPoor_Promissum_Mane-8B-LINEAR-lorablated_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DreadPoor/Promissum_Mane-8B-LINEAR-lorablated" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DreadPoor/Promissum_Mane-8B-LINEAR-lorablated</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__Promissum_Mane-8B-LINEAR-lorablated-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DreadPoor/Promissum_Mane-8B-LINEAR-lorablated
34c4a30b7462704810e35e033aa5ef33b075a97b
28.810739
0
8
false
true
true
false
true
0.792342
0.715636
71.563562
0.543518
34.609107
0.152568
15.256798
0.303691
7.158837
0.419792
13.840625
0.37392
30.435505
false
2024-09-30
2024-09-30
1
DreadPoor/Promissum_Mane-8B-LINEAR-lorablated (Merge)
DreadPoor_Sellen-8B-model_stock_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DreadPoor/Sellen-8B-model_stock" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DreadPoor/Sellen-8B-model_stock</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__Sellen-8B-model_stock-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DreadPoor/Sellen-8B-model_stock
accde7145d81a428c782695ea61eebc608efd980
26.362467
0
8
false
true
true
false
true
0.807471
0.711289
71.128938
0.523168
31.360979
0.132175
13.217523
0.274329
3.243848
0.396042
10.671875
0.356965
28.55164
false
2024-08-27
0
Removed
DreadPoor_Trinas_Nectar-8B-model_stock_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DreadPoor/Trinas_Nectar-8B-model_stock" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DreadPoor/Trinas_Nectar-8B-model_stock</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__Trinas_Nectar-8B-model_stock-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DreadPoor/Trinas_Nectar-8B-model_stock
cb46b8431872557904d83fc5aa1b90dabeb74acc
27.535042
apache-2.0
3
8
true
false
true
false
true
0.866724
0.725927
72.592721
0.525612
31.975094
0.153323
15.332326
0.286074
4.809843
0.406771
11.413021
0.361785
29.087249
false
2024-08-16
2024-08-27
1
DreadPoor/Trinas_Nectar-8B-model_stock (Merge)
DreadPoor_WIP_Damascus-8B-TIES_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/DreadPoor/WIP_Damascus-8B-TIES" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DreadPoor/WIP_Damascus-8B-TIES</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__WIP_Damascus-8B-TIES-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DreadPoor/WIP_Damascus-8B-TIES
c7720a0b0a8d24e62bf71b0e955b1aca8e62f1cb
24.731381
apache-2.0
2
8
true
false
true
false
true
0.818112
0.477633
47.763268
0.541067
34.522306
0.151057
15.10574
0.307047
7.606264
0.411854
12.715104
0.37608
30.675606
false
2024-10-29
2024-10-29
1
DreadPoor/WIP_Damascus-8B-TIES (Merge)
DreadPoor_felix_dies-mistral-7B-model_stock_bfloat16
bfloat16
🤝 base merges and moerges
🤝
Original
MistralForCausalLM
<a target="_blank" href="https://huggingface.co/DreadPoor/felix_dies-mistral-7B-model_stock" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">DreadPoor/felix_dies-mistral-7B-model_stock</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/DreadPoor__felix_dies-mistral-7B-model_stock-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
DreadPoor/felix_dies-mistral-7B-model_stock
bb317aa7565625327e18c5158aebd4710aa1d925
18.101828
0
7
false
true
true
false
false
0.661572
0.300779
30.07786
0.490092
28.890798
0.05136
5.135952
0.291946
5.592841
0.451823
15.477865
0.310921
23.435653
false
2024-09-30
0
Removed
EleutherAI_gpt-j-6b_bfloat16
bfloat16
🟢 pretrained
🟢
Original
GPTJForCausalLM
<a target="_blank" href="https://huggingface.co/EleutherAI/gpt-j-6b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EleutherAI/gpt-j-6b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EleutherAI__gpt-j-6b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
EleutherAI/gpt-j-6b
47e169305d2e8376be1d31e765533382721b2cc1
6.557824
apache-2.0
1,441
6
true
true
true
false
false
0.767432
0.252219
25.221856
0.319104
4.912818
0.01284
1.283988
0.245805
0
0.36575
5.252083
0.124086
2.676197
true
2022-03-02
2024-08-19
0
EleutherAI/gpt-j-6b
EleutherAI_gpt-neo-1.3B_bfloat16
bfloat16
🟢 pretrained
🟢
Original
GPTNeoForCausalLM
<a target="_blank" href="https://huggingface.co/EleutherAI/gpt-neo-1.3B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EleutherAI/gpt-neo-1.3B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EleutherAI__gpt-neo-1.3B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
EleutherAI/gpt-neo-1.3B
dbe59a7f4a88d01d1ba9798d78dbe3fe038792c8
5.340738
mit
263
1
true
true
true
false
false
0.359424
0.207905
20.790503
0.303923
3.024569
0.007553
0.755287
0.255872
0.782998
0.381656
4.873698
0.116356
1.817376
true
2022-03-02
2024-06-12
0
EleutherAI/gpt-neo-1.3B
EleutherAI_gpt-neo-125m_bfloat16
bfloat16
🟢 pretrained
🟢
Original
GPTNeoForCausalLM
<a target="_blank" href="https://huggingface.co/EleutherAI/gpt-neo-125m" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EleutherAI/gpt-neo-125m</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EleutherAI__gpt-neo-125m-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
EleutherAI/gpt-neo-125m
21def0189f5705e2521767faed922f1f15e7d7db
4.382146
mit
182
0
true
true
true
false
false
0.202902
0.190544
19.054442
0.311516
3.436739
0.004532
0.453172
0.253356
0.447427
0.359333
2.616667
0.10256
0.284427
true
2022-03-02
2024-08-10
0
EleutherAI/gpt-neo-125m
EleutherAI_gpt-neo-2.7B_bfloat16
bfloat16
🟢 pretrained
🟢
Original
GPTNeoForCausalLM
<a target="_blank" href="https://huggingface.co/EleutherAI/gpt-neo-2.7B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EleutherAI/gpt-neo-2.7B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EleutherAI__gpt-neo-2.7B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
EleutherAI/gpt-neo-2.7B
e24fa291132763e59f4a5422741b424fb5d59056
6.355519
mit
438
2
true
true
true
false
false
0.508381
0.258963
25.896289
0.313952
4.178603
0.006042
0.60423
0.26594
2.12528
0.355365
3.520573
0.116273
1.808141
true
2022-03-02
2024-06-12
0
EleutherAI/gpt-neo-2.7B
EleutherAI_gpt-neox-20b_float16
float16
🟢 pretrained
🟢
Original
GPTNeoXForCausalLM
<a target="_blank" href="https://huggingface.co/EleutherAI/gpt-neox-20b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EleutherAI/gpt-neox-20b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EleutherAI__gpt-neox-20b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
EleutherAI/gpt-neox-20b
c292233c833e336628618a88a648727eb3dff0a7
6.003229
apache-2.0
534
20
true
true
true
false
false
3.146736
0.258688
25.868806
0.316504
4.929114
0.006798
0.679758
0.243289
0
0.364667
2.816667
0.115525
1.72503
true
2022-04-07
2024-06-09
0
EleutherAI/gpt-neox-20b
EleutherAI_pythia-12b_float16
float16
🟢 pretrained
🟢
Original
GPTNeoXForCausalLM
<a target="_blank" href="https://huggingface.co/EleutherAI/pythia-12b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EleutherAI/pythia-12b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EleutherAI__pythia-12b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
EleutherAI/pythia-12b
35c9d7f32fbb108fb8b5bdd574eb03369d1eed49
5.93396
apache-2.0
131
12
true
true
true
false
false
1.118007
0.247148
24.714757
0.317965
4.987531
0.009063
0.906344
0.246644
0
0.364698
3.78724
0.110871
1.20789
true
2023-02-28
2024-06-12
0
EleutherAI/pythia-12b
EleutherAI_pythia-160m_float16
float16
🟢 pretrained
🟢
Original
GPTNeoXForCausalLM
<a target="_blank" href="https://huggingface.co/EleutherAI/pythia-160m" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EleutherAI/pythia-160m</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EleutherAI__pythia-160m-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
EleutherAI/pythia-160m
50f5173d932e8e61f858120bcb800b97af589f46
5.617102
apache-2.0
25
0
true
true
true
false
false
0.235339
0.181552
18.155162
0.297044
2.198832
0.002266
0.226586
0.258389
1.118568
0.417938
10.675521
0.111951
1.32794
true
2023-02-08
2024-06-09
0
EleutherAI/pythia-160m
EleutherAI_pythia-2.8b_float16
float16
🟢 pretrained
🟢
Original
GPTNeoXForCausalLM
<a target="_blank" href="https://huggingface.co/EleutherAI/pythia-2.8b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EleutherAI/pythia-2.8b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EleutherAI__pythia-2.8b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
EleutherAI/pythia-2.8b
2a259cdd96a4beb1cdf467512e3904197345f6a9
5.454241
apache-2.0
28
2
true
true
true
false
false
0.753902
0.217322
21.732226
0.322409
5.077786
0.007553
0.755287
0.25
0
0.348573
3.638281
0.113697
1.521868
true
2023-02-13
2024-06-12
0
EleutherAI/pythia-2.8b
EleutherAI_pythia-410m_float16
float16
🟢 pretrained
🟢
Original
GPTNeoXForCausalLM
<a target="_blank" href="https://huggingface.co/EleutherAI/pythia-410m" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EleutherAI/pythia-410m</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EleutherAI__pythia-410m-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
EleutherAI/pythia-410m
9879c9b5f8bea9051dcb0e68dff21493d67e9d4f
5.113779
apache-2.0
21
0
true
true
true
false
false
0.377082
0.219545
21.954525
0.302813
2.715428
0.003021
0.302115
0.259228
1.230425
0.357813
3.059896
0.112783
1.420287
true
2023-02-13
2024-06-09
0
EleutherAI/pythia-410m
EleutherAI_pythia-6.9b_float16
float16
🟢 pretrained
🟢
Original
GPTNeoXForCausalLM
<a target="_blank" href="https://huggingface.co/EleutherAI/pythia-6.9b" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EleutherAI/pythia-6.9b</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EleutherAI__pythia-6.9b-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
EleutherAI/pythia-6.9b
f271943e880e60c0c715fd10e4dc74ec4e31eb44
5.865842
apache-2.0
48
6
true
true
true
false
false
0.868867
0.228114
22.811363
0.323229
5.881632
0.008308
0.830816
0.251678
0.223714
0.359052
3.814844
0.114694
1.632683
true
2023-02-14
2024-06-12
0
EleutherAI/pythia-6.9b
Enno-Ai_EnnoAi-Pro-French-Llama-3-8B-v0.4_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Enno-Ai/EnnoAi-Pro-French-Llama-3-8B-v0.4" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Enno-Ai/EnnoAi-Pro-French-Llama-3-8B-v0.4</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Enno-Ai__EnnoAi-Pro-French-Llama-3-8B-v0.4-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Enno-Ai/EnnoAi-Pro-French-Llama-3-8B-v0.4
328722ae96e3a112ec900dbe77d410788a526c5c
15.180945
creativeml-openrail-m
0
8
true
true
true
false
true
1.009128
0.418881
41.888079
0.407495
16.875928
0.006042
0.60423
0.270973
2.796421
0.417
10.758333
0.263464
18.162677
false
2024-06-27
2024-06-30
0
Enno-Ai/EnnoAi-Pro-French-Llama-3-8B-v0.4
Enno-Ai_EnnoAi-Pro-Llama-3-8B_bfloat16
bfloat16
💬 chat models (RLHF, DPO, IFT, ...)
💬
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Enno-Ai/EnnoAi-Pro-Llama-3-8B" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Enno-Ai/EnnoAi-Pro-Llama-3-8B</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Enno-Ai__EnnoAi-Pro-Llama-3-8B-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Enno-Ai/EnnoAi-Pro-Llama-3-8B
6a5d745bdd304753244fe601e2a958d37d13cd71
12.174667
creativeml-openrail-m
0
8
true
true
true
false
true
1.184337
0.319538
31.953772
0.415158
17.507545
0.001511
0.151057
0.261745
1.565996
0.407052
9.08151
0.215093
12.788121
false
2024-07-01
2024-07-08
0
Enno-Ai/EnnoAi-Pro-Llama-3-8B
Enno-Ai_EnnoAi-Pro-Llama-3-8B-v0.3_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Enno-Ai/EnnoAi-Pro-Llama-3-8B-v0.3" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Enno-Ai/EnnoAi-Pro-Llama-3-8B-v0.3</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Enno-Ai__EnnoAi-Pro-Llama-3-8B-v0.3-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Enno-Ai/EnnoAi-Pro-Llama-3-8B-v0.3
cf29b8b484a909132e3a1f85ce891d28347c0d13
17.524058
creativeml-openrail-m
0
8
true
true
true
false
true
1.470836
0.508257
50.825698
0.410058
16.668386
0.012085
1.208459
0.265101
2.013423
0.423573
12.313281
0.299036
22.1151
false
2024-06-26
2024-06-26
0
Enno-Ai/EnnoAi-Pro-Llama-3-8B-v0.3
Enno-Ai_EnnoAi-Pro-Llama-3.1-8B-v0.9_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/Enno-Ai/EnnoAi-Pro-Llama-3.1-8B-v0.9" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">Enno-Ai/EnnoAi-Pro-Llama-3.1-8B-v0.9</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/Enno-Ai__EnnoAi-Pro-Llama-3.1-8B-v0.9-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
Enno-Ai/EnnoAi-Pro-Llama-3.1-8B-v0.9
c740871122fd471a1a225cf2b4368e333752d74c
14.945694
apache-2.0
0
8
true
true
true
false
true
0.932571
0.468915
46.89147
0.416027
17.498296
0
0
0.26594
2.12528
0.383177
5.430469
0.259558
17.72865
false
2024-08-22
2024-09-06
0
Enno-Ai/EnnoAi-Pro-Llama-3.1-8B-v0.9
EnnoAi_EnnoAi-Pro-Llama-3.1-8B-v1.0_bfloat16
bfloat16
🔶 fine-tuned on domain-specific datasets
🔶
Original
LlamaForCausalLM
<a target="_blank" href="https://huggingface.co/EnnoAi/EnnoAi-Pro-Llama-3.1-8B-v1.0" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">EnnoAi/EnnoAi-Pro-Llama-3.1-8B-v1.0</a> <a target="_blank" href="https://huggingface.co/datasets/open-llm-leaderboard/EnnoAi__EnnoAi-Pro-Llama-3.1-8B-v1.0-details" style="color: var(--link-text-color); text-decoration: underline;text-decoration-style: dotted;">📑</a>
EnnoAi/EnnoAi-Pro-Llama-3.1-8B-v1.0
c740871122fd471a1a225cf2b4368e333752d74c
14.97109
apache-2.0
0
8
true
true
true
false
true
0.945642
0.470438
47.043844
0.416027
17.498296
0
0
0.26594
2.12528
0.383177
5.430469
0.259558
17.72865
false
2024-08-22
2024-09-06
0
EnnoAi/EnnoAi-Pro-Llama-3.1-8B-v1.0
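The "Average ⬆️" field in each record appears to be the arithmetic mean of the six normalized benchmark scores (IFEval, BBH, MATH Lvl 5, GPQA, MUSR, MMLU-PRO). A minimal sketch, assuming that relationship and reusing the values from the EnnoAi/EnnoAi-Pro-Llama-3.1-8B-v1.0 record directly above; the column-to-average mapping is inferred from the layout, not documented behavior:

```python
# Assumption: "Average ⬆️" = mean of the six normalized benchmark columns.
# Values copied from the EnnoAi/EnnoAi-Pro-Llama-3.1-8B-v1.0 record above.
normalized_scores = {
    "IFEval": 47.043844,
    "BBH": 17.498296,
    "MATH Lvl 5": 0.0,
    "GPQA": 2.12528,
    "MUSR": 5.430469,
    "MMLU-PRO": 17.72865,
}

average = sum(normalized_scores.values()) / len(normalized_scores)
print(round(average, 6))  # 14.97109 — matches the record's "Average ⬆️" field
```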