Llama-3-6B-Instruct-pruned / mergekit_config.yml
dtype: bfloat16
merge_method: passthrough
slices:
- sources:
  - layer_range: [0, 21]
    model:
      model:
        path: meta-llama/Meta-Llama-3-8B-Instruct
- sources:
  - layer_range: [29, 32]
    model:
      model:
        path: meta-llama/Meta-Llama-3-8B-Instruct
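
This passthrough merge keeps layers 0-20 and 29-31 of meta-llama/Meta-Llama-3-8B-Instruct (24 of its 32 decoder layers; mergekit layer ranges are end-exclusive) and drops the middle block, yielding the roughly 6B-parameter pruned model. Below is a minimal sketch of applying the config via mergekit's Python entry point; the config and output paths are illustrative assumptions, and the call pattern follows mergekit's documented run_merge usage.

import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Load the YAML above into a validated mergekit configuration.
with open("mergekit_config.yml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

# Run the passthrough merge and write the pruned model to disk.
# The output directory name is assumed for illustration.
run_merge(
    merge_config,
    out_path="./Llama-3-6B-Instruct-pruned",
    options=MergeOptions(
        copy_tokenizer=True,   # reuse the source model's tokenizer files
        lazy_unpickle=True,    # lower peak memory while reading shards
    ),
)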