---
base_model:
- Gryphe/Pantheon-RP-1.0-8b-Llama-3
- Sao10K/L3-8B-Stheno-v3.2
library_name: transformers
tags:
- mergekit
- merge
---

# merge
This is a merge of pre-trained language models created using [mergekit](https://github.com/arcee-ai/mergekit).
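The result loads as a standard `transformers` causal-LM checkpoint. A minimal loading sketch; the repo id below is a placeholder for wherever this merge is hosted:

```python
# Minimal loading sketch; "your-namespace/this-merge" is a hypothetical
# placeholder repo id, not the actual location of this model.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-namespace/this-merge"  # placeholder
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)
```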
## Merge Details
### Merge Method
This model was merged using the [DARE](https://arxiv.org/abs/2311.03099) [TIES](https://arxiv.org/abs/2306.01708) merge method, with Gryphe/Pantheon-RP-1.0-8b-Llama-3 as the base.
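For intuition, the sketch below shows roughly what DARE TIES does per weight tensor: DARE randomly drops a fraction (1 − density) of each task vector (fine-tuned minus base) and rescales the survivors, then TIES elects a per-parameter sign and combines only the deltas that agree with it. This is an illustrative NumPy approximation, not mergekit's actual implementation:

```python
# Illustrative DARE TIES merge for one tensor (NumPy sketch, not mergekit code).
import numpy as np

rng = np.random.default_rng(0)

def dare_ties(base, finetuned, densities, weights):
    deltas = []
    for ft, density, weight in zip(finetuned, densities, weights):
        delta = ft - base                             # task vector
        keep = rng.random(delta.shape) < density      # DARE: random drop...
        delta = np.where(keep, delta / density, 0.0)  # ...rescale survivors
        deltas.append(weight * delta)
    stacked = np.stack(deltas)
    sign = np.sign(stacked.sum(axis=0))               # TIES: elect a sign...
    agree = np.sign(stacked) == sign                  # ...keep agreeing deltas
    merged_delta = np.where(agree, stacked, 0.0).sum(axis=0)
    return base + merged_delta

# e.g. two models at density 0.5 with weights 1.0 / 0.9, as in the config below
merged = dare_ties(
    base=np.zeros((4, 4)),
    finetuned=[rng.normal(size=(4, 4)), rng.normal(size=(4, 4))],
    densities=[0.5, 0.5],
    weights=[1.0, 0.9],
)
```

The `density` and `weight` values in the configuration below map directly onto the `densities` and `weights` arguments here.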
### Models Merged

The following models were included in the merge:
* [Sao10K/L3-8B-Stheno-v3.2](https://huggingface.co/Sao10K/L3-8B-Stheno-v3.2)
### Configuration

The following YAML configuration was used to produce this model:
```yaml
slices:
- sources:
  - layer_range: [0, 16]
    model: Sao10K/L3-8B-Stheno-v3.2
    parameters:
      density: 0.5
      weight: 1.0
  - layer_range: [0, 16]
    model: Gryphe/Pantheon-RP-1.0-8b-Llama-3
    parameters:
      density: 0.5
      weight: 0.9
- sources:
  - layer_range: [16, 32]
    model: Sao10K/L3-8B-Stheno-v3.2
    parameters:
      density: 0.5
      weight: 0.9
  - layer_range: [16, 32]
    model: Gryphe/Pantheon-RP-1.0-8b-Llama-3
    parameters:
      density: 0.5
      weight: 1.0
merge_method: dare_ties
tokenizer_source: base
base_model: Gryphe/Pantheon-RP-1.0-8b-Llama-3
parameters:
  int8_mask: true
dtype: bfloat16
```
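To reproduce the merge, save the configuration as `config.yml` and run it through mergekit, either via the `mergekit-yaml` CLI or the Python API. A sketch assuming mergekit's documented entry points (`MergeConfiguration`, `run_merge`, `MergeOptions`); exact options vary by version:

```python
# Sketch of re-running this merge via mergekit's Python API; roughly
# equivalent to `mergekit-yaml config.yml ./merged-model`. Entry points
# assumed per the mergekit README and may differ across versions.
import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

with open("config.yml", encoding="utf-8") as fp:
    config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    config,
    out_path="./merged-model",
    options=MergeOptions(
        cuda=False,           # set True to run the merge on GPU
        copy_tokenizer=True,  # write a tokenizer into the output directory
    ),
)
```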