license: apache-2.0
tags:
  - merge


⭐ UPDATE ⭐

Use this instead:

https://hf.co/Weyaxi/OpenHermes-2.5-neural-chat-v3-3-Slerp


OpenHermes-2.5-neural-chat-v3-2-Slerp

This is the model card for OpenHermes-2.5-neural-chat-v3-2-Slerp, a SLERP merge of teknium/OpenHermes-2.5-Mistral-7B and Intel/neural-chat-7b-v3-2 created with mergekit.

Prompt Templates

You can use either of these prompt templates, but I recommend ChatML; a short usage sketch follows the templates below.

ChatML (OpenHermes-2.5-Mistral-7B):

<|im_start|>system
{system}<|im_end|>
<|im_start|>user
{user}<|im_end|>
<|im_start|>assistant
{assistant}<|im_end|>

neural-chat-7b-v3-2:

### System:
{system}
### User:
{user}
### Assistant:
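
Below is a minimal usage sketch (not part of the original card) showing ChatML-style generation with transformers. The repo id is the merged model from this card; the system and user messages are placeholder examples.

# Sketch: generate with the ChatML template shown above.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Weyaxi/OpenHermes-2.5-neural-chat-v3-2-Slerp"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)

# Build the ChatML prompt exactly as in the template above.
prompt = (
    "<|im_start|>system\nYou are a helpful assistant.<|im_end|>\n"
    "<|im_start|>user\nWrite a haiku about model merging.<|im_end|>\n"
    "<|im_start|>assistant\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0][inputs.input_ids.shape[-1]:], skip_special_tokens=True))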

YAML config to reproduce the merge


slices:
  - sources:
      - model: teknium/OpenHermes-2.5-Mistral-7B
        layer_range: [0, 32]
      - model: Intel/neural-chat-7b-v3-2
        layer_range: [0, 32]
merge_method: slerp
base_model: mistralai/Mistral-7B-v0.1
parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5 # fallback for rest of tensors
dtype: float16
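
To reproduce the merge, save the YAML above as config.yml and run it through mergekit (the project also ships a mergekit-yaml CLI that takes the config path and an output directory). The snippet below is a sketch using mergekit's Python API; exact imports and options may differ between mergekit versions.

# Sketch: run the SLERP merge defined in config.yml with mergekit.
import yaml
import torch
from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

with open("config.yml", "r", encoding="utf-8") as f:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(f))

run_merge(
    merge_config,
    out_path="./OpenHermes-2.5-neural-chat-v3-2-Slerp",  # output directory (any path)
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # merge on GPU if one is available
        copy_tokenizer=True,             # copy a tokenizer into the output directory
        lazy_unpickle=False,
        low_cpu_memory=False,
    ),
)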

Open LLM Leaderboard Evaluation Results

Detailed results can be found here

| Metric              | Value |
|---------------------|-------|
| Avg.                | 70.2  |
| ARC (25-shot)       | 67.49 |
| HellaSwag (10-shot) | 85.42 |
| MMLU (5-shot)       | 64.13 |
| TruthfulQA (0-shot) | 61.05 |
| Winogrande (5-shot) | 80.3  |
| GSM8K (5-shot)      | 63.08 |