---
base_model:
- nbeerbower/Mistral-Small-Gutenberg-Doppel-22B
- TheDrummer/Cydonia-22B-v1
library_name: transformers
tags:
- mergekit
- merge
---
A merge of Cydonia came out, and while I'm happy with Cydonia as my current daily-driver model, I'd be remiss if I didn't keep an eye out for something new that might be fun.
This is the EXL2 6bpw quant of [this model](https://huggingface.co/DazzlingXeno/Cydonian-Gutenberg).
[For the 8bpw version, go here.](https://huggingface.co/Statuo/MS-Cydonian-Gutenberg-22b-8bpw)
[For the 4bpw version, go here.](https://huggingface.co/Statuo/MS-Cydonian-Gutenberg-22b-4bpw)
A merge of Cydonia and Mistral Small Gutenberg.
This will hopefully make it an even better storyteller.
Use the Mistral or ChatML prompt format; a sketch of both is below.
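A minimal sketch of the two prompt formats as raw strings (the exact special tokens depend on the tokenizer shipped with the model, so check `tokenizer_config.json` if in doubt):

```python
# Mistral instruct format: a single [INST] ... [/INST] turn.
mistral_prompt = "[INST] Write the opening scene of a mystery novel. [/INST]"

# ChatML format: role-tagged turns delimited by <|im_start|> / <|im_end|>.
chatml_prompt = (
    "<|im_start|>system\nYou are a creative storyteller.<|im_end|>\n"
    "<|im_start|>user\nWrite the opening scene of a mystery novel.<|im_end|>\n"
    "<|im_start|>assistant\n"
)
```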
Much appreciation to the original model creators, The Drummer and nbeerbower.
https://ko-fi.com/dazzlingxeno
# merge
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
## Merge Details
### Merge Method
This model was merged using the SLERP merge method.
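For intuition: SLERP interpolates between two models' weight tensors along an arc rather than a straight line, which preserves the magnitude of the weights better than plain linear averaging. A minimal NumPy sketch of the idea (illustrative only, not mergekit's actual implementation):

```python
import numpy as np

def slerp(t: float, a: np.ndarray, b: np.ndarray, eps: float = 1e-8) -> np.ndarray:
    """Spherical linear interpolation between two weight tensors."""
    a_flat, b_flat = a.ravel(), b.ravel()
    # Angle between the two weight vectors.
    cos_theta = np.dot(a_flat, b_flat) / (
        np.linalg.norm(a_flat) * np.linalg.norm(b_flat) + eps
    )
    theta = np.arccos(np.clip(cos_theta, -1.0, 1.0))
    if theta < eps:
        # Nearly parallel vectors: fall back to plain linear interpolation.
        return (1.0 - t) * a + t * b
    # Walk along the arc between the two endpoints.
    return (np.sin((1.0 - t) * theta) * a + np.sin(t * theta) * b) / np.sin(theta)
```

In the config below, `t` is this interpolation factor: 0 keeps one endpoint's weights and 1 the other's. The `self_attn` and `mlp` filters sweep it in opposite directions across layer groups, with `0.5` as the default for everything else.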
### Models Merged
The following models were included in the merge:
* [nbeerbower/Mistral-Small-Gutenberg-Doppel-22B](https://huggingface.co/nbeerbower/Mistral-Small-Gutenberg-Doppel-22B)
* [TheDrummer/Cydonia-22B-v1](https://huggingface.co/TheDrummer/Cydonia-22B-v1)
### Configuration
The following YAML configuration was used to produce this model:
```yaml
base_model: nbeerbower/Mistral-Small-Gutenberg-Doppel-22B
dtype: bfloat16
merge_method: slerp
parameters:
  t:
    - filter: self_attn
      value: [0.0, 0.5, 0.3, 0.7, 1.0]
    - filter: mlp
      value: [1.0, 0.5, 0.7, 0.3, 0.0]
    - value: 0.5
slices:
  - sources:
      - layer_range: [0, 56]
        model: TheDrummer/Cydonia-22B-v1
      - layer_range: [0, 56]
        model: nbeerbower/Mistral-Small-Gutenberg-Doppel-22B
```
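To reproduce the merge, something along these lines should work, assuming mergekit's Python API as shown in its README (the config filename and output path here are hypothetical):

```python
import yaml
import torch
from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Load the YAML configuration above (hypothetical filename).
with open("cydonian-gutenberg.yaml", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    merge_config,
    "./Cydonian-Gutenberg-22B",  # output directory (hypothetical path)
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # merge on GPU if one is available
        copy_tokenizer=True,             # carry the tokenizer into the output
    ),
)
```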