---
base_model:
- Sao10K/L3-8B-Stheno-v3.1
- openlynn/Llama-3-Soliloquy-8B-v2
- Nitral-AI/Poppy_Porpoise-1.0-L3-8B
- NeverSleep/Llama-3-Lumimaid-8B-v0.1-OAS
- flammenai/Mahou-1.0-llama3-8B
- failspy/Llama-3-8B-Instruct-MopeyMule
- Hastagaras/UltimateANJIR-8B-L3-Blackroot
- TheSkullery/llama-3-cat-8b-instruct-v1
library_name: transformers
tags:
- mergekit
- merge
---
# Peter
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
## Merge Details
### Merge Method
This model was merged using the breadcrumbs_ties merge method, with [failspy/Llama-3-8B-Instruct-MopeyMule](https://huggingface.co/failspy/Llama-3-8B-Instruct-MopeyMule) as the base.
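In breadcrumbs_ties, each fine-tune is reduced to a "task vector" (its difference from the base model). The Model Breadcrumbs step sparsifies that vector to a magnitude band: the largest-magnitude `gamma` fraction of entries is discarded as outliers, and the next `density` fraction is retained. The TIES step then elects a sign per parameter across all task vectors and keeps only entries agreeing with that sign before summing the weighted deltas back onto the base. The NumPy sketch below illustrates this per-tensor idea only; it is not mergekit's implementation, all names are hypothetical, and it omits details such as the `rescale` compensation applied after pruning.

```python
import numpy as np

def breadcrumbs_mask(delta: np.ndarray, density: float, gamma: float) -> np.ndarray:
    """Keep a magnitude band of the task vector: discard the top `gamma`
    fraction of entries (outliers), then retain the next `density` fraction."""
    order = np.argsort(np.abs(delta).ravel())  # ascending by magnitude
    n = order.size
    k_top = int(round(gamma * n))       # outliers to discard
    k_keep = int(round(density * n))    # band to retain below the outliers
    keep = order[max(n - k_top - k_keep, 0) : n - k_top]
    mask = np.zeros(n, dtype=bool)
    mask[keep] = True
    return mask.reshape(delta.shape)

def breadcrumbs_ties(base, tuned, weights, density, gamma):
    """Per-tensor sketch: sparsify each task vector, elect a sign per
    parameter, and sum only the sign-agreeing weighted deltas onto the base."""
    deltas = [w * (t - base) * breadcrumbs_mask(t - base, density, gamma)
              for t, w in zip(tuned, weights)]
    stacked = np.stack(deltas)
    elected = np.sign(stacked.sum(axis=0))   # TIES sign election
    agree = np.sign(stacked) == elected      # drop sign-conflicting entries
    return base + np.where(agree, stacked, 0.0).sum(axis=0)
```

Note that with `normalize: false`, as in the configuration below, the weighted deltas are summed as-is rather than renormalized by the total weight.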
### Models Merged
The following models were included in the merge:
* [Sao10K/L3-8B-Stheno-v3.1](https://huggingface.co/Sao10K/L3-8B-Stheno-v3.1)
* [openlynn/Llama-3-Soliloquy-8B-v2](https://huggingface.co/openlynn/Llama-3-Soliloquy-8B-v2)
* [Nitral-AI/Poppy_Porpoise-1.0-L3-8B](https://huggingface.co/Nitral-AI/Poppy_Porpoise-1.0-L3-8B)
* [NeverSleep/Llama-3-Lumimaid-8B-v0.1-OAS](https://huggingface.co/NeverSleep/Llama-3-Lumimaid-8B-v0.1-OAS)
* [flammenai/Mahou-1.0-llama3-8B](https://huggingface.co/flammenai/Mahou-1.0-llama3-8B)
* [Hastagaras/UltimateANJIR-8B-L3-Blackroot](https://huggingface.co/Hastagaras/UltimateANJIR-8B-L3-Blackroot)
* [TheSkullery/llama-3-cat-8b-instruct-v1](https://huggingface.co/TheSkullery/llama-3-cat-8b-instruct-v1)
### Configuration
The following YAML configuration was used to produce this model:
```yaml
models:
  - model: failspy/Llama-3-8B-Instruct-MopeyMule
  - model: flammenai/Mahou-1.0-llama3-8B # 7/10
    parameters:
      density: 0.4
      weight: 0.14
  - model: TheSkullery/llama-3-cat-8b-instruct-v1 # 6/10
    parameters:
      density: 0.3
      weight: 0.1
  - model: Nitral-AI/Poppy_Porpoise-1.0-L3-8B # 7/10
    parameters:
      density: 0.5
      weight: 0.18
  - model: openlynn/Llama-3-Soliloquy-8B-v2 # 8/10
    parameters:
      density: 0.5
      weight: 0.18
  - model: Hastagaras/UltimateANJIR-8B-L3-Blackroot # 6/10
    parameters:
      density: 0.3
      weight: 0.1
  - model: NeverSleep/Llama-3-Lumimaid-8B-v0.1-OAS # 7/10
    parameters:
      density: 0.4
      weight: 0.14
  - model: Sao10K/L3-8B-Stheno-v3.1 # 9/10
    parameters:
      density: 0.6
      weight: 0.23
merge_method: breadcrumbs_ties
base_model: failspy/Llama-3-8B-Instruct-MopeyMule
parameters:
  normalize: false
  rescale: true
  gamma: 0.01
dtype: float16
name: Peter
```
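To reproduce the merge, save the YAML above as `config.yml` and feed it to mergekit, either with the `mergekit-yaml` CLI (`mergekit-yaml config.yml ./Peter`) or via the Python API. Below is a minimal sketch, assuming a recent mergekit version; the output path is illustrative:

```python
import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Parse the merge recipe shown above
with open("config.yml", "r", encoding="utf-8") as fp:
    config = MergeConfiguration.model_validate(yaml.safe_load(fp))

# Write the merged weights and tokenizer to ./Peter
run_merge(
    config,
    out_path="./Peter",
    options=MergeOptions(cuda=False, copy_tokenizer=True),
)
```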