---
base_model:
- sreeramajay/TinyLlama-1.1B-orca-v1.0
- TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T
- ShieldX/manovyadh-1.1B-v1-chat
- l3utterfly/tinyllama-1.1b-layla-v4
- AIGym/TinyLlama-1.1B-2.5T-chat-and-function-calling
- TinyLlama/TinyLlama-1.1B-Chat-v1.0
- vihangd/DopeyTinyLlama-1.1B-v1
- raidhon/coven_tiny_1.1b_32k_orpo_alpha
- appvoid/palmer-003
library_name: transformers
tags:
- mergekit
- merge
---

# merge
This is a merge of pre-trained language models created using [mergekit](https://github.com/arcee-ai/mergekit).
## Merge Details

### Merge Method
This model was merged using the [task arithmetic](https://arxiv.org/abs/2212.04089) merge method, with [TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T](https://huggingface.co/TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T) as the base.
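Conceptually, task arithmetic builds a "task vector" for each model (its parameter delta from the shared base checkpoint), scales each vector by its `weight`, and adds the weighted sum back onto the base. A minimal sketch of the idea in plain PyTorch, assuming state dicts with matching keys (an illustration of the method, not mergekit's actual implementation):

```python
import torch

def task_arithmetic_merge(base_sd, model_sds, weights):
    """Add a weighted sum of task vectors (model - base) onto the base weights."""
    merged = {}
    for name, base_param in base_sd.items():
        # Each model contributes weight * (fine-tuned - base).
        delta = sum(w * (sd[name] - base_param) for sd, w in zip(model_sds, weights))
        merged[name] = base_param + delta
    return merged
```

With `normalize: false` (as in the configuration below), the per-model weights are applied as given rather than being rescaled by their sum.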
### Models Merged
The following models were included in the merge:
- [sreeramajay/TinyLlama-1.1B-orca-v1.0](https://huggingface.co/sreeramajay/TinyLlama-1.1B-orca-v1.0)
- [ShieldX/manovyadh-1.1B-v1-chat](https://huggingface.co/ShieldX/manovyadh-1.1B-v1-chat)
- [l3utterfly/tinyllama-1.1b-layla-v4](https://huggingface.co/l3utterfly/tinyllama-1.1b-layla-v4)
- [AIGym/TinyLlama-1.1B-2.5T-chat-and-function-calling](https://huggingface.co/AIGym/TinyLlama-1.1B-2.5T-chat-and-function-calling)
- [TinyLlama/TinyLlama-1.1B-Chat-v1.0](https://huggingface.co/TinyLlama/TinyLlama-1.1B-Chat-v1.0)
- [vihangd/DopeyTinyLlama-1.1B-v1](https://huggingface.co/vihangd/DopeyTinyLlama-1.1B-v1)
- [raidhon/coven_tiny_1.1b_32k_orpo_alpha](https://huggingface.co/raidhon/coven_tiny_1.1b_32k_orpo_alpha)
- [appvoid/palmer-003](https://huggingface.co/appvoid/palmer-003)
### Configuration
The following YAML configuration was used to produce this model:
```yaml
models:
  - model: vihangd/DopeyTinyLlama-1.1B-v1
    parameters:
      density: 0.50
      weight: 0.50
  - model: raidhon/coven_tiny_1.1b_32k_orpo_alpha
    parameters:
      density: 0.66
      weight: 0.26
  - model: l3utterfly/tinyllama-1.1b-layla-v4
    parameters:
      density: 0.30
      weight: 0.125
  - model: ShieldX/manovyadh-1.1B-v1-chat
    parameters:
      density: 0.18
      weight: 0.125
  - model: TinyLlama/TinyLlama-1.1B-Chat-v1.0
    parameters:
      density: 0.40
      weight: 0.25
  - model: sreeramajay/TinyLlama-1.1B-orca-v1.0
    parameters:
      density: 0.35
      weight: 0.37
  - model: AIGym/TinyLlama-1.1B-2.5T-chat-and-function-calling
    parameters:
      density: 0.25
      weight: 0.26
  - model: appvoid/palmer-003
    parameters:
      density: 0.90
      weight: 0.75
merge_method: task_arithmetic
base_model: TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T
parameters:
  normalize: false
  int8_mask: true
dtype: float16
```
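To reproduce the merge, this configuration can be saved to a YAML file and passed to mergekit's `mergekit-yaml` command. The resulting checkpoint loads like any other causal LM; a minimal sketch with transformers, where `./merged-model` is a placeholder for the merge output directory:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder path: point this at the directory mergekit wrote the merge to.
model = AutoModelForCausalLM.from_pretrained("./merged-model", torch_dtype=torch.float16)
tokenizer = AutoTokenizer.from_pretrained("./merged-model")

inputs = tokenizer("Tell me about TinyLlama.", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

In the configuration, `weight` scales each model's task vector, while `density` is a sparsification knob used by some mergekit merge methods to keep only a fraction of each task vector's parameters.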