---
license: cc-by-nc-4.0
tags:
- merge
- mergekit
- text-generation-inference
- transformers
---
This is a merge of Fimbulvetr-11B-v2 and Kuro-Lotus-10.7B using the following recipe:
```yaml
slices:
  - sources:
      - model: saishf/Kuro-Lotus-10.7B
        layer_range: [0, 48]
      - model: Sao10K/Fimbulvetr-11B-v2
        layer_range: [0, 48]
merge_method: slerp
base_model: saishf/Kuro-Lotus-10.7B
parameters:
  t:
    - filter: self_attn
      value: [0.6, 0.7, 0.8, 0.9, 1]
    - filter: mlp
      value: [0.4, 0.3, 0.2, 0.1, 0]
    - value: 0.5
dtype: bfloat16
```
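To illustrate what the `slerp` merge method does (this is a minimal NumPy sketch of spherical linear interpolation, not mergekit's actual implementation), each pair of weight tensors is interpolated along the arc between them rather than along a straight line, with `t` controlling how far toward the second model the result lies:

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight vectors.

    t=0 returns v0, t=1 returns v1; intermediate t values follow
    the great-circle arc between the (normalized) directions.
    """
    v0n = v0 / np.linalg.norm(v0)
    v1n = v1 / np.linalg.norm(v1)
    dot = np.clip(np.dot(v0n, v1n), -1.0, 1.0)
    theta = np.arccos(dot)
    if theta < eps:
        # Nearly parallel vectors: fall back to plain linear interpolation
        return (1 - t) * v0 + t * v1
    s = np.sin(theta)
    return np.sin((1 - t) * theta) / s * v0 + np.sin(t * theta) / s * v1

a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])
mid = slerp(0.5, a, b)  # midpoint on the arc between a and b
```

In the recipe above, the per-filter `t` schedules mean the self-attention weights lean increasingly toward Fimbulvetr-11B-v2 in deeper layers, while the MLP weights lean the opposite way; tensors matching neither filter use the flat `t = 0.5`.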
Models used:
- Fimbulvetr-11B-v2: https://huggingface.co/Sao10K/Fimbulvetr-11B-v2
- Kuro-Lotus-10.7B: https://huggingface.co/saishf/Kuro-Lotus-10.7B