---
license: apache-2.0
tags:
- merge
- mergekit
- lazymergekit
- mistralai/Mistral-7B-Instruct-v0.2
- meta-math/MetaMath-Mistral-7B
- openchat/openchat-3.5-1210
---
# evo_exp-point-mix-linear
evo_exp-point-mix-linear is a merge of the following models using [mergekit](https://github.com/cg123/mergekit):
* [mistralai/Mistral-7B-Instruct-v0.2](https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.2)
* [meta-math/MetaMath-Mistral-7B](https://huggingface.co/meta-math/MetaMath-Mistral-7B)
* [openchat/openchat-3.5-1210](https://huggingface.co/openchat/openchat-3.5-1210)
## 🧩 Configuration
```yaml
models:
  - model: mistralai/Mistral-7B-Instruct-v0.2
    parameters:
      weight: 0.3
  - model: meta-math/MetaMath-Mistral-7B
    parameters:
      weight: 0.3
  - model: openchat/openchat-3.5-1210
    parameters:
      weight: 0.3
merge_method: linear
dtype: float16
```
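
Note that mergekit's `linear` method normalizes weights by default, so the three 0.3 values amount to an equal one-third contribution from each model.

## 💻 Usage

The snippet below is a minimal sketch of loading the merged model with 🤗 Transformers, assuming the merged weights have been pushed to the Hugging Face Hub; the repo id `your-username/evo_exp-point-mix-linear` is a placeholder to replace with the actual one.

```python
# Minimal sketch: load the merged model and generate text.
# Requires `transformers`, `torch`, and `accelerate` (for device_map="auto").
from transformers import AutoTokenizer, pipeline
import torch

model_id = "your-username/evo_exp-point-mix-linear"  # hypothetical repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
pipe = pipeline(
    "text-generation",
    model=model_id,
    torch_dtype=torch.float16,  # matches the dtype used in the merge config
    device_map="auto",
)

# Build a chat-style prompt from the tokenizer's chat template.
messages = [{"role": "user", "content": "What is a large language model?"}]
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)

outputs = pipe(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_p=0.95)
print(outputs[0]["generated_text"])
```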