LLaMa-3-8B-First-Layer / mergekit_config.yml
merge_method: passthrough
dtype: bfloat16
slices:
  - sources:
      - layer_range: [0, 1]
        model: NousResearch/Meta-Llama-3-8B
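This config performs a `passthrough` merge that copies only layers `[0, 1)` — the single first transformer layer — out of NousResearch/Meta-Llama-3-8B, matching the repo name. A minimal sketch of checking such a config before running it, assuming PyYAML is installed (the config text below is the file above re-indented; nothing else is from the repo):

```python
# Parse the mergekit config and confirm which layers the slice keeps.
# Assumes PyYAML (`pip install pyyaml`); not part of the original repo.
import yaml

CONFIG = """\
merge_method: passthrough
dtype: bfloat16
slices:
  - sources:
      - layer_range: [0, 1]
        model: NousResearch/Meta-Llama-3-8B
"""

cfg = yaml.safe_load(CONFIG)
source = cfg["slices"][0]["sources"][0]
start, end = source["layer_range"]  # mergekit ranges are [start, end)
print(f"keeping layers [{start}, {end}) of {source['model']}")
# → keeping layers [0, 1) of NousResearch/Meta-Llama-3-8B
```

With mergekit installed, a config like this is typically applied with its CLI, e.g. `mergekit-yaml mergekit_config.yml ./output-dir` (invocation hedged; check the mergekit README for current flags).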