---
base_model:
- djuna/L3.1-Noraian
- Casual-Autopsy/L3-Super-Nova-RP-8B
- TheDrummer/Llama-3SOME-8B-v2
- kromeurus/L3.1-Aglow-Vulca-v0.1-8B
library_name: transformers
tags:
- mergekit
- merge
---
# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details
### Merge Method

This model was merged using the della merge method, with [kromeurus/L3.1-Aglow-Vulca-v0.1-8B](https://huggingface.co/kromeurus/L3.1-Aglow-Vulca-v0.1-8B) as the base model.

### Models Merged

The following models were included in the merge:
* [djuna/L3.1-Noraian](https://huggingface.co/djuna/L3.1-Noraian)
* [Casual-Autopsy/L3-Super-Nova-RP-8B](https://huggingface.co/Casual-Autopsy/L3-Super-Nova-RP-8B)
* [TheDrummer/Llama-3SOME-8B-v2](https://huggingface.co/TheDrummer/Llama-3SOME-8B-v2)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: Casual-Autopsy/L3-Super-Nova-RP-8B
    parameters:
      density: 0.5
      weight: 0.27
  - model: TheDrummer/Llama-3SOME-8B-v2
    parameters:
      density: 0.6
      weight: 0.23
  - model: djuna/L3.1-Noraian
    parameters:
      density: 0.4
      weight: 0.28
merge_method: della
base_model: kromeurus/L3.1-Aglow-Vulca-v0.1-8B
parameters:
  epsilon: 0.1
  lambda: 1.0
  normalize: false
  int8_mask: true
dtype: float32
out_dtype: bfloat16
```
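
### Usage

To reproduce the merge, save the configuration above as `config.yaml` and run it through the mergekit CLI (e.g. `mergekit-yaml config.yaml ./output-directory`). Once the merged weights are available, they can be loaded like any other Llama 3.1 checkpoint with `transformers`. The sketch below is a minimal, hedged example; the repo id is a placeholder, not the actual published model name.

```python
# Minimal sketch for loading and sampling from the merged model.
# "your-username/merged-model" is a placeholder; replace it with the real
# Hugging Face repo id or a local path to the mergekit output directory.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "your-username/merged-model"  # placeholder repo id (assumption)

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,  # matches out_dtype in the merge config
    device_map="auto",
)

prompt = "Write a short scene set in a storm-lashed lighthouse."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=200, do_sample=True, temperature=0.8)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```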