---
library_name: transformers
tags:
- mergekit
- merge
base_model:
- bluuwhale/L3-SAO-MIX-8B-V1
- Sao10K/L3-8B-Niitama-v1
- Sao10K/L3-8B-Lunaris-v1
- Sao10K/L3-8B-Tamamo-v1
- Sao10K/L3-8B-Stheno-v3.2
---
# L3-bluuwhale-SAO-MIX-8B-V1_fp32-merge-calc
This is a remerge of bluuwhale's [L3-SAO-MIX-8B-V1](https://huggingface.co/bluuwhale/L3-SAO-MIX-8B-V1) using the exact same YAML config; the only difference is that the merge calculations are done in fp32 instead of bf16.
## Merge Details

### Merge Method
This model was merged using the [DELLA](https://arxiv.org/abs/2406.11617) merge method, with [Sao10K/L3-8B-Niitama-v1](https://huggingface.co/Sao10K/L3-8B-Niitama-v1) as the base.

I've done this because I'm planning to use it in another merge, but you can use it as-is if you wish.
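For intuition: DELLA prunes each model's delta from the base with magnitude-aware random dropping, rescales the survivors, and then fuses the pruned deltas back onto the base. The sketch below is a simplified, hypothetical illustration of that drop-and-rescale step only; the names `drop_and_rescale`, `density`, and `eps` are my own, and the paper's exact MAGPRUNE formulation differs in detail.

```python
import torch

def drop_and_rescale(delta: torch.Tensor, density: float = 0.7, eps: float = 0.1) -> torch.Tensor:
    """Simplified sketch of DELLA-style magnitude-aware pruning of a task vector.

    Entries with larger magnitude get a lower drop probability; survivors are
    rescaled by 1 / (1 - p_drop) so each entry keeps its expected value.
    """
    n = delta.numel()
    # Rank entries by magnitude: rank 0 = smallest, n - 1 = largest.
    ranks = delta.abs().flatten().argsort().argsort().float()
    # Drop probabilities centred on (1 - density), spread over a window of
    # width eps, decreasing as magnitude rank increases.
    p_drop = (1.0 - density) + eps * (0.5 - ranks / max(n - 1, 1))
    p_drop = p_drop.clamp(0.0, 1.0 - 1e-6).reshape(delta.shape)
    keep = torch.bernoulli(1.0 - p_drop)
    return delta * keep / (1.0 - p_drop)
```

Because every delta goes through this pruning and an averaging step, the precision of the intermediate arithmetic matters, which is the point of redoing the merge in fp32.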
### Models Merged

The following models were included in the merge:
* [Sao10K/L3-8B-Lunaris-v1](https://huggingface.co/Sao10K/L3-8B-Lunaris-v1)
* [Sao10K/L3-8B-Stheno-v3.2](https://huggingface.co/Sao10K/L3-8B-Stheno-v3.2)
* [Sao10K/L3-8B-Tamamo-v1](https://huggingface.co/Sao10K/L3-8B-Tamamo-v1)
### Configuration

The following YAML configuration was used to produce this model:
```yaml
models:
  - model: Sao10K/L3-8B-Lunaris-v1
    parameters:
      weight: 1.0
  - model: Sao10K/L3-8B-Stheno-v3.2
    parameters:
      weight: 1.0
  - model: Sao10K/L3-8B-Niitama-v1
    parameters:
      weight: 1.0
  - model: Sao10K/L3-8B-Tamamo-v1
    parameters:
      weight: 1.0
base_model: Sao10K/L3-8B-Niitama-v1
merge_method: della
dtype: float32
out_dtype: bfloat16
```
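To rerun the merge yourself, here is a minimal sketch using mergekit's Python API (assuming the `MergeConfiguration`, `run_merge`, and `MergeOptions` entry points shown in mergekit's README; the file and output paths are placeholders):

```python
import yaml
import torch
from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Load the YAML config shown above (saved as config.yaml; path is a placeholder).
with open("config.yaml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

# dtype: float32 in the config is what puts the merge arithmetic in fp32;
# out_dtype: bfloat16 only controls the precision of the saved weights.
run_merge(
    merge_config,
    out_path="./L3-bluuwhale-SAO-MIX-8B-V1_fp32-merge-calc",  # placeholder output dir
    options=MergeOptions(cuda=torch.cuda.is_available()),
)
```

The same merge can also be run from the command line with mergekit's `mergekit-yaml` tool, pointing it at the config file and an output directory.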