---
base_model:
  - kromeurus/L3-Blackened-Sunfall-15B
  - Hastagaras/Jamet-8B-L3-MK.V-Blackroot
  - TheDrummer/Llama-3SOME-8B-v2
library_name: transformers
tags:
  - mergekit
  - merge
  - not-for-all-audiences
---


Well, this merge didn't go as expected, at all. I went in trying to make an 8B downscale of Blackfall Summanus and, a comical amount of dumb mistakes later, managed to make this surprisingly solid merge. I don't know either; I'm still processing how this model exists because I fat-fingered my keyboard. Anyways, here is Summanus Ara. Please look at the original model card for more details.

Quants

Q8_0 only GGUF by me.

GGUFs by BackyardAI

imatrix quants are not available yet.
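
If it helps, here's a minimal sketch of pulling one of the GGUF files with huggingface_hub. The repo ID and filename below are placeholders, not the actual quant repos linked above; swap in whichever quant you're using.

```python
# Minimal sketch: fetching a GGUF quant with huggingface_hub.
# repo_id and filename are placeholders; point them at whichever quant repo you use.
from huggingface_hub import hf_hub_download

gguf_path = hf_hub_download(
    repo_id="your-username/summanus-ara-gguf",  # placeholder repo ID
    filename="summanus-ara-q8_0.gguf",          # placeholder file name
)
print(gguf_path)  # local cache path, usable with any llama.cpp-based backend
```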

Details & Recommended Settings

(Still testing; not finalized)

Compared to the OG 15B version, BF Summanus Ara is surprisingly capable for its size while keeping most of the original attributes. Obviously it won't be as verbose or nuanced due to natural limitations, though it's no less eloquent. It's a little more precise and coherent, and somehow sticks to the example text to a T, exactly like Aethora v2, despite that model not being added to the merge. Not as chatty as expected given the additional models; it paces itself quite well unless prompted otherwise.

Overall, very close to the OG in all the important aspects. Does amazingly in RP and eRP, leaning more narrative-driven and story-heavy for best results.

Rec. Settings:

Template: Model Default
Temperature: 1.3
Min P: 0.08
Repeat Penalty: 1.05
Repeat Penalty Tokens: 256
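
As a rough example (not an official snippet), here's how these settings map onto llama-cpp-python with a local GGUF. The model path and prompts are placeholders, and option names may differ slightly in other backends.

```python
# Rough example: applying the recommended sampler settings via llama-cpp-python.
# Model path and prompts are placeholders; tune n_ctx / n_gpu_layers to your hardware.
from llama_cpp import Llama

llm = Llama(
    model_path="./summanus-ara-q8_0.gguf",  # placeholder path to a GGUF quant
    n_ctx=8192,
    n_gpu_layers=-1,            # offload all layers if VRAM allows
    last_n_tokens_size=256,     # repeat penalty window (Repeat Penalty Tokens: 256)
)

out = llm.create_chat_completion(
    messages=[
        {"role": "system", "content": "You are the narrator of a long-form roleplay."},
        {"role": "user", "content": "Continue the scene."},
    ],
    temperature=1.3,            # Temperature: 1.3
    min_p=0.08,                 # Min P: 0.08
    repeat_penalty=1.05,        # Repeat Penalty: 1.05
    max_tokens=512,
)
print(out["choices"][0]["message"]["content"])
```

In recent llama-cpp-python versions, create_chat_completion falls back to the chat template embedded in the GGUF when no chat_format is given, which lines up with the "Model Default" template recommendation above.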

Models Merged

I first made passthrough merges of the models listed above into separate parts that each have aspects of what I wanted in the final model, then did a breadcrumbs merge with said parts, as seen below.

Configs

```yaml
models:
  - model: parts/summanus.ds.9b
    # No parameters necessary for base model
  - model: parts/summanusara.atp1
    parameters:
      density: [0.33, 0.01, 0.33]
      weight: 0.8
      gamma: 0.001
merge_method: breadcrumbs
base_model: parts/summanus.ds.9b
parameters:
  normalize: true
  int8_mask: true
dtype: bfloat16
```
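
To actually run a config like this, here's a sketch using mergekit's Python API. The config filename, output directory, and the parts/ checkpoints are assumptions; the intermediate parts aren't published, so they'd have to be rebuilt first.

```python
# Sketch: executing the merge config above with mergekit's Python API.
# Assumes it is saved as config.yml and that the parts/ checkpoints exist locally.
import yaml
import torch

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

with open("config.yml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    merge_config,
    out_path="./summanus-ara-out",          # placeholder output directory
    options=MergeOptions(
        cuda=torch.cuda.is_available(),     # merge on GPU when available
        copy_tokenizer=True,
        lazy_unpickle=False,
        low_cpu_memory=False,
    ),
)
```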