---
pipeline_tag: text-generation
tags:
- mistral
- merge
license: cc-by-4.0
---

# Model Card for Moko-DARE

Part of a merge-method experiment. The mergekit `.yaml` configuration used to produce this merge:

```yaml
models:
  - model: Open-Orca/Mistral-7B-OpenOrca
    parameters:
      density: [1, 0.7, 0.1] # density gradient
      weight: 1.0
  - model: akjindal53244/Mistral-7B-v0.1-Open-Platypus
    parameters:
      density: 0.5
      weight: [0, 0.3, 0.7, 1] # weight gradient
  - model: WizardLM/WizardMath-7B-V1.1
    parameters:
      density: 0.33
      weight:
        - filter: mlp
          value: 0.5
        - value: 0
merge_method: dare_ties
base_model: mistralai/Mistral-7B-v0.1
parameters:
  normalize: true
  int8_mask: true
dtype: float16
```
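In mergekit configs, a list value such as `density: [1, 0.7, 0.1]` is a gradient: the values are interpolated linearly across the model's layers, so early layers get high density and late layers get low density. The sketch below illustrates that interpolation with a hypothetical helper (`interpolate_gradient` is not part of mergekit's API; this is only a minimal model of the behavior, assuming linear interpolation between evenly spaced anchor points):

```python
def interpolate_gradient(gradient, num_layers):
    """Linearly interpolate a mergekit-style parameter gradient across layers.

    `gradient` is a list of anchor values spread evenly over the network depth;
    returns one value per layer. Hypothetical helper for illustration only.
    """
    if len(gradient) == 1:
        return [float(gradient[0])] * num_layers
    values = []
    for i in range(num_layers):
        # Normalized position of this layer in [0, 1].
        t = i / (num_layers - 1) if num_layers > 1 else 0.0
        # Map the position onto the gradient's segments.
        pos = t * (len(gradient) - 1)
        lo = int(pos)
        hi = min(lo + 1, len(gradient) - 1)
        frac = pos - lo
        values.append(gradient[lo] * (1 - frac) + gradient[hi] * frac)
    return values

# Density gradient from the config above, over a toy 5-layer model:
densities = interpolate_gradient([1, 0.7, 0.1], 5)
```

For the real 32-layer Mistral-7B the same scheme applies, just with more interpolation points. Assuming mergekit is installed, a config like this is typically run with `mergekit-yaml config.yml ./output-model-directory`.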