---
base_model:
- allura-org/MS-Meadowlark-22B
- TheDrummer/UnslopSmall-22B-v1
- Gryphe/Pantheon-RP-Pure-1.6.2-22b-Small
- ArliAI/Mistral-Small-22B-ArliAI-RPMax-v1.1
library_name: transformers
tags:
- mergekit
- merge
---

# SchisandraVA2

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the della_linear merge method, with [TheDrummer/UnslopSmall-22B-v1](https://huggingface.co/TheDrummer/UnslopSmall-22B-v1) as the base.

### Models Merged

The following models were included in the merge:

* [allura-org/MS-Meadowlark-22B](https://huggingface.co/allura-org/MS-Meadowlark-22B)
* Step1
* [Gryphe/Pantheon-RP-Pure-1.6.2-22b-Small](https://huggingface.co/Gryphe/Pantheon-RP-Pure-1.6.2-22b-Small)
* [ArliAI/Mistral-Small-22B-ArliAI-RPMax-v1.1](https://huggingface.co/ArliAI/Mistral-Small-22B-ArliAI-RPMax-v1.1)
* Step2

### Configuration

The following YAML configuration was used to produce this model:

```yaml
merge_method: della_linear
dtype: bfloat16
parameters:
  normalize: true
  int8_mask: true
tokenizer_source: base
base_model: TheDrummer/UnslopSmall-22B-v1
models:
  - model: ArliAI/Mistral-Small-22B-ArliAI-RPMax-v1.1
    parameters:
      density: 0.55
      weight: 1
  - model: Gryphe/Pantheon-RP-Pure-1.6.2-22b-Small
    parameters:
      density: 0.55
      weight: 1
  - model: Step1
    parameters:
      density: 0.55
      weight: 1
  - model: allura-org/MS-Meadowlark-22B
    parameters:
      density: 0.55
      weight: 1
  - model: Step2
    parameters:
      density: 0.55
      weight: 1
```
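
## Usage

A minimal loading sketch with 🤗 Transformers is below. The repository id is a placeholder (not taken from this card); substitute the actual location where SchisandraVA2 is hosted.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

# Placeholder repo id -- replace with wherever SchisandraVA2 is actually published.
model_id = "your-username/SchisandraVA2"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # the merge itself was performed in bfloat16
    device_map="auto",
)

prompt = "Write a short scene set in a rainy harbor town."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```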