# LogoS-7Bx2-MoE-13B-v0.1

Model built by @RubielLabarta using the SLERP merge method. The model is released for research purposes only; commercial use is not allowed.

LogoS is an experiment with the Mixture-of-Experts (MoE) method, which can significantly improve on the performance of the base models. The model has 12.9B parameters.
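
A minimal usage sketch, assuming the standard Hugging Face `transformers` API and the repository id shown on this card (the v0.1 pin is an assumption); the BF16 dtype matches the published weights:

```python
# Minimal sketch: load the merged MoE model and generate text.
# Repository id and generation settings are illustrative assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "RubielLabarta/LogoS-7Bx2-MoE-13B-v0.1"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # weights are published in BF16
    device_map="auto",
)

prompt = "Explain the Mixture-of-Experts idea in one paragraph."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```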

## Open LLM Leaderboard Evaluation Results

Detailed results can be found here.

| Metric                            | Value |
|-----------------------------------|-------|
| Avg.                              | 77.14 |
| AI2 Reasoning Challenge (25-shot) | 74.49 |
| HellaSwag (10-shot)               | 89.07 |
| MMLU (5-shot)                     | 64.74 |
| TruthfulQA (0-shot)               | 74.57 |
| Winogrande (5-shot)               | 88.32 |
| GSM8K (5-shot)                    | 71.65 |
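
The Open LLM Leaderboard computes these scores with EleutherAI's lm-evaluation-harness. Below is a sketch of re-running one task locally through the harness's Python API; the task name and few-shot count mirror the table, while the batch size and dtype string are assumptions, and the exact harness version may differ from the leaderboard's configuration:

```python
# Sketch: reproduce the HellaSwag (10-shot) score with lm-evaluation-harness
# (pip install lm-eval). Results may differ slightly from the leaderboard
# depending on harness version and hardware.
import lm_eval

results = lm_eval.simple_evaluate(
    model="hf",
    model_args="pretrained=RubielLabarta/LogoS-7Bx2-MoE-13B-v0.1,dtype=bfloat16",
    tasks=["hellaswag"],
    num_fewshot=10,  # matches the 10-shot setting in the table
    batch_size=8,
)
print(results["results"]["hellaswag"])
```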
