---
base_model: []
library_name: transformers
tags:
- mergekit
- merge
---
# V3_u_5_scaling

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the breadcrumbs_ties merge method, with F:\merger\mergekit\V3_ultraperecision\V3_u_4_scaling as the base.

### Models Merged

The following models were included in the merge:
* F:\text_models\Iambe-RP-v3-20b

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: "F:\\merger\\mergekit\\V3_ultraperecision\\V3_u_4_scaling"
  - model: "F:\\text_models\\Iambe-RP-v3-20b"
    parameters:
      weight: 0.5
      density: 0.9
      gamma: 0.01
base_model: "F:\\merger\\mergekit\\V3_ultraperecision\\V3_u_4_scaling"
merge_method: breadcrumbs_ties
dtype: float32
```
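
### Usage

The merge output is a standard Transformers checkpoint, so it can be loaded like any other causal LM. Below is a minimal sketch; the local path `./V3_u_5_scaling` is an assumption, point it at wherever the merged weights were written.

```python
# Minimal sketch for loading and sampling from the merged model.
# "./V3_u_5_scaling" is an assumed local output directory, not a published repo id.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_path = "./V3_u_5_scaling"  # assumption: directory produced by the merge

tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(
    model_path,
    torch_dtype=torch.float16,  # the merge itself was done in float32; half precision is typical for inference
    device_map="auto",
)

prompt = "Write a short greeting."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=True, temperature=0.8)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```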