---
base_model:
- kromeurus/L3-Blackened-Sunfall-15B
- Hastagaras/Jamet-8B-L3-MK.V-Blackroot
- TheDrummer/Llama-3SOME-8B-v2
library_name: transformers
tags:
- mergekit
- merge
- not-for-all-audiences
---

![image/jpeg](https://cdn-uploads.huggingface.co/production/uploads/667eea5cdebd46a5ec4dcc3d/HzAhXawzvRnvlmatPrwld.jpeg)

Well, this merge didn't go as expected, at all. Went in trying to make an 8B downscale of [Blackfall Summanus](https://huggingface.co/kromeurus/L3-Blackfall-Summanus-v0.1-15B) and, a comical number of dumb mistakes later, managed to make this surprisingly solid merge. I don't know either; I'm still processing how this model exists because I fat-fingered my keyboard. Anyways, here is Summanus Ara. Please look at the original model card for more details.

### Quants

[OG Q8_0 GGUF](https://huggingface.co/kromeurus/L3-8.9B-Blackfall-SummanusAra-v0.1-Q8-GGUF) by me.

[GGUFs](https://huggingface.co/backyardai/L3-8.9B-Blackfall-SummanusAra-v0.1-GGUF) by [BackyardAI](https://huggingface.co/backyardai)

[GGUFs](https://huggingface.co/mradermacher/L3-8.9B-Blackfall-SummanusAra-v0.1-GGUF) by [mradermacher](https://huggingface.co/mradermacher)

[imatrix GGUFs](https://huggingface.co/mradermacher/L3-8.9B-Blackfall-SummanusAra-v0.1-i1-GGUF) by [mradermacher](https://huggingface.co/mradermacher)

### Details & Recommended Settings

Compared to the OG 15B version, BF Summanus Ara is surprisingly capable for its size while keeping most of the original's attributes. It obviously won't be as verbose or nuanced due to natural limitations, though it's no less eloquent. A little more precise and coherent; it somehow sticks to the example text to a T, exactly like Aethora v2, despite that model not being added to the merge. Not as chatty as expected given the additional models; it paces itself quite well unless prompted otherwise. Overall, very close to the OG in all the important aspects. Does amazingly in RP and eRP; lean narrative-driven and story-heavy for best results.
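For context on the Min P value in the recommended settings: min-p sampling keeps only the tokens whose probability is at least `min_p` times the most likely token's probability, then renormalizes. A minimal sketch with a toy distribution (illustrative values only, not this model's actual logits):

```python
# Min-p filtering: keep tokens with prob >= min_p * max(prob).
def min_p_filter(probs, min_p=0.08):
    threshold = min_p * max(probs.values())
    kept = {tok: p for tok, p in probs.items() if p >= threshold}
    # Renormalize the surviving probability mass before sampling.
    total = sum(kept.values())
    return {tok: p / total for tok, p in kept.items()}

# Toy next-token distribution.
probs = {"the": 0.60, "a": 0.25, "an": 0.10, "this": 0.04, "zzz": 0.01}
filtered = min_p_filter(probs, min_p=0.08)
# threshold = 0.08 * 0.60 = 0.048, so "this" and "zzz" are dropped
```

At the recommended 0.08, the filter scales with model confidence: a flat distribution keeps many candidates, a peaked one prunes the long tail.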
Recommended settings:

```
Template: Model Default
Temperature: 1.3
Min P: 0.08
Repeat Penalty: 1.05
Repeat Penalty Tokens: 256
```

### Models Merged

* [kromeurus/L3-Blackened-Sunfall-15B](https://huggingface.co/kromeurus/L3-Blackened-Sunfall-15B)
* [Hastagaras/Jamet-8B-L3-MK.V-Blackroot](https://huggingface.co/Hastagaras/Jamet-8B-L3-MK.V-Blackroot)
* [TheDrummer/Llama-3SOME-8B-v2](https://huggingface.co/TheDrummer/Llama-3SOME-8B-v2)
* [crestf411/L3-8B-sunfall-v0.4-stheno-v3.2](https://huggingface.co/crestf411/L3-8B-sunfall-v0.4-stheno-v3.2)

I first made passthrough merges of the models listed above into separate parts, each with aspects of what I wanted in the final model, then did a breadcrumbs merge with said parts as seen below.

### Configs

summanus.ds.9b:

```yaml
slices:
- sources:
  - layer_range: [0, 28]
    model: kromeurus/L3-Blackfall-Summanus-v0.1-15B
- sources:
  - layer_range: [56, 64]
    model: kromeurus/L3-Blackfall-Summanus-v0.1-15B
parameters:
  int8_mask: true
merge_method: passthrough
dtype: bfloat16
```

summanusara.atp1:

```yaml
slices:
- sources:
  - layer_range: [0, 8]
    model: crestf411/L3-8B-sunfall-v0.4-stheno-v3.2
- sources:
  - layer_range: [8, 16]
    model: TheDrummer/Llama-3SOME-8B-v2
- sources:
  - layer_range: [16, 24]
    model: Hastagaras/Jamet-8B-L3-MK.V-Blackroot
- sources:
  - layer_range: [22, 26]
    model: TheDrummer/Llama-3SOME-8B-v2
- sources:
  - layer_range: [24, 32]
    model: crestf411/L3-8B-sunfall-v0.4-stheno-v3.2
parameters:
  int8_mask: true
merge_method: passthrough
dtype: bfloat16
```

final:

```yaml
models:
- model: parts/summanus.ds.9b
  # No parameters necessary for base model
- model: parts/summanusara.atp1
  parameters:
    density: [0.33, 0.01, 0.33]
    weight: 0.8
    gamma: 0.001
merge_method: breadcrumbs
base_model: parts/summanus.ds.9b
parameters:
  normalize: true
  int8_mask: true
dtype: bfloat16
```
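As a sanity check on the two passthrough configs above: both parts stack to the same depth, which is what lets the final breadcrumbs merge combine them layer-for-layer. A quick sketch of the slice arithmetic (layer ranges copied from the configs):

```python
# Layer ranges from the two passthrough configs above.
summanus_ds_9b = [(0, 28), (56, 64)]
summanusara_atp1 = [(0, 8), (8, 16), (16, 24), (22, 26), (24, 32)]

def total_layers(slices):
    # Each slice contributes (end - start) layers to the stacked model.
    return sum(end - start for start, end in slices)

print(total_layers(summanus_ds_9b))    # 36
print(total_layers(summanusara_atp1))  # 36
```

Both parts come out to 36 layers; the four extra layers over a stock L3 8B's 32 are presumably where the ~8.9B parameter count in the model name comes from.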