# Mistral-7B-v0.1-ties
Mistral-7B-v0.1-ties is a TIES merge of the following models using [mergekit](https://github.com/cg123/mergekit):

* [mistralai/Mistral-7B-v0.1](https://huggingface.co/mistralai/Mistral-7B-v0.1) (base model)
* [mlabonne/NeuralHermes-2.5-Mistral-7B](https://huggingface.co/mlabonne/NeuralHermes-2.5-Mistral-7B)
## 🧩 Configuration
```yaml
models:
  - model: mistralai/Mistral-7B-v0.1
    # no parameters necessary for base model
  - model: mlabonne/NeuralHermes-2.5-Mistral-7B
    parameters:
      density: 0.5
      weight: 0.3
merge_method: ties
base_model: mistralai/Mistral-7B-v0.1
parameters:
  normalize: true
dtype: float16
```
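To give an intuition for what `density`, `weight`, and `normalize` control, here is a toy, flat-array sketch of the TIES procedure (trim each task vector to its top-`density` fraction by magnitude, elect a per-parameter sign, then merge the agreeing entries). This is an illustration of the method, not mergekit's actual implementation; all function and variable names are hypothetical.

```python
import numpy as np

def ties_merge(base, finetuned, densities, weights, normalize=True):
    """Toy TIES sketch on flat arrays (illustrative only, not mergekit's code)."""
    w = np.asarray(weights, dtype=float)[:, None]
    deltas = []
    for ft, d in zip(finetuned, densities):
        tau = np.asarray(ft, dtype=float) - base   # task vector vs. base model
        k = max(1, int(round(d * tau.size)))       # keep top `density` fraction
        thresh = np.sort(np.abs(tau))[-k]
        deltas.append(np.where(np.abs(tau) >= thresh, tau, 0.0))
    deltas = np.stack(deltas) * w                  # weighted, trimmed task vectors
    sign = np.sign(deltas.sum(axis=0))             # elect a sign per parameter
    agree = np.sign(deltas) == sign                # drop entries with conflicting sign
    merged = np.where(agree, deltas, 0.0).sum(axis=0)
    if normalize:                                  # rescale by surviving weight mass
        denom = np.where(agree, w, 0.0).sum(axis=0)
        merged = np.where(denom > 0, merged / np.maximum(denom, 1e-12), 0.0)
    return base + merged

# With a single non-base model (as in the config above), density=0.5 zeroes the
# smaller half of the task vector, and normalize=True cancels the 0.3 weight.
base = np.zeros(4)
ft = np.array([1.0, -2.0, 0.1, 3.0])
print(ties_merge(base, [ft], densities=[0.5], weights=[0.3]))  # → [ 0. -2.  0.  3.]
```

With several non-base models the sign election matters: parameters where the weighted task vectors disagree in sign keep only the majority-sign contributions, which is what distinguishes `ties` from a plain weighted average.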