---
base_model: []
library_name: transformers
tags:
- mergekit
- merge
---
# model

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the breadcrumbs_ties merge method, with Z:\peter\LLM's\Llama-3-Giraffe-70B-Instruct as the base model.

### Models Merged

The following models were included in the merge:

* Z:\peter\LLM's\Smaug-Llama-3-70B-Instruct
* I:\Llama-3-Lumimaid-70B-v0.1-alt
* I:\Tess-2.0-Llama-3-70B-v0.2

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: Z:\peter\LLM's\Llama-3-Giraffe-70B-Instruct
    parameters:
      weight: 0.25
      density: 0.90
      gamma: 0.01
  - model: Z:\peter\LLM's\Smaug-Llama-3-70B-Instruct
    parameters:
      weight: 0.30
      density: 0.90
      gamma: 0.01
  - model: I:\Tess-2.0-Llama-3-70B-v0.2
    parameters:
      weight: 0.15
      density: 0.90
      gamma: 0.01
  - model: I:\Llama-3-Lumimaid-70B-v0.1-alt
    parameters:
      weight: 0.30
      density: 0.90
      gamma: 0.01
merge_method: breadcrumbs_ties
base_model: Z:\peter\LLM's\Llama-3-Giraffe-70B-Instruct
dtype: bfloat16
```
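As a quick sanity check on the configuration, the per-model `weight` values in this merge sum to 1.0 (a common convention for weighted merges, though mergekit can also normalize weights itself). A minimal sketch, with the weights copied from the YAML above:

```python
# Per-model merge weights taken from the breadcrumbs_ties config above.
weights = {
    "Llama-3-Giraffe-70B-Instruct": 0.25,  # also the base model
    "Smaug-Llama-3-70B-Instruct": 0.30,
    "Tess-2.0-Llama-3-70B-v0.2": 0.15,
    "Llama-3-Lumimaid-70B-v0.1-alt": 0.30,
}

total = sum(weights.values())
print(f"total weight: {total:.2f}")  # → total weight: 1.00
```

All four models also share the same `density` (0.90) and `gamma` (0.01), so only `weight` varies per model in this merge.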