---
base_model:
  - mistralai/Mistral-7B-v0.1
  - cognitivecomputations/dolphin-2.2.1-mistral-7b
  - Open-Orca/Mistral-7B-OpenOrca
  - openchat/openchat-3.5-0106
  - mlabonne/NeuralHermes-2.5-Mistral-7B
  - GreenNode/GreenNode-mini-7B-multilingual-v1olet
  - berkeley-nest/Starling-LM-7B-alpha
  - viethq188/LeoScorpius-7B-Chat-DPO
  - meta-math/MetaMath-Mistral-7B
  - Intel/neural-chat-7b-v3-3
library_name: transformers
inference: false
tags:
  - mergekit
  - merge
---

# Moza-7B-v1.0


This is a meme-merge of pre-trained language models, created using mergekit. Use at your own risk.

## Details

### Quantized Model

### Merge Method

This model was merged using the DARE TIES merge method, with mistralai/Mistral-7B-v0.1 as the base.

The density value is taken from this blogpost, and the weights were randomly generated and then assigned to the models, with priority (i.e., the larger weights) given to NeuralHermes, OpenOrca, and neural-chat. The models themselves were chosen by "vibes".
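For intuition, here is a minimal sketch of what `density` and `weight` mean in DARE: each model's delta from the base is randomly pruned so that roughly a `density` fraction of entries survives, rescaled to keep the expected value unchanged, and then scaled by its `weight` before being summed onto the base (the full `dare_ties` method additionally resolves sign conflicts between deltas, TIES-style, which is omitted here). This is illustrative only, not mergekit's actual code, and the function name `dare_delta` is made up:

```python
import torch

def dare_delta(model_param: torch.Tensor,
               base_param: torch.Tensor,
               density: float,
               weight: float) -> torch.Tensor:
    """Illustrative DARE drop-and-rescale for one tensor (not mergekit's code)."""
    delta = model_param - base_param           # task vector vs. the base model
    keep = torch.rand_like(delta) < density    # randomly keep ~`density` of the entries
    delta = delta * keep / density             # rescale survivors to preserve the expectation
    return weight * delta                      # this model's weighted contribution

# Toy usage: combine two "models" onto a shared base parameter.
base = torch.zeros(5)
merged = (base
          + dare_delta(torch.ones(5), base, density=0.63, weight=0.83)
          + dare_delta(-torch.ones(5), base, density=0.63, weight=0.74))
```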

### Models Merged

The following models were included in the merge:

- mlabonne/NeuralHermes-2.5-Mistral-7B
- Intel/neural-chat-7b-v3-3
- meta-math/MetaMath-Mistral-7B
- openchat/openchat-3.5-0106
- Open-Orca/Mistral-7B-OpenOrca
- cognitivecomputations/dolphin-2.2.1-mistral-7b
- viethq188/LeoScorpius-7B-Chat-DPO
- GreenNode/GreenNode-mini-7B-multilingual-v1olet
- berkeley-nest/Starling-LM-7B-alpha

## Prompt Format

You can use Alpaca formatting for inference:

```
### Instruction:

### Response:
```
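As an illustration, a minimal transformers snippet that wraps a prompt in this format (assuming the model is published as `kidyu/Moza-7B-v1.0`; adjust the repo id, device placement, and generation settings as needed):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "kidyu/Moza-7B-v1.0"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Wrap the user instruction in the Alpaca-style template shown above
prompt = "### Instruction:\nExplain what a model merge is.\n\n### Response:\n"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)

# Decode only the newly generated tokens
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:],
                       skip_special_tokens=True))
```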

## Configuration

The following YAML configuration was used to produce this model:

```yaml
base_model: mistralai/Mistral-7B-v0.1
models:
  - model: mlabonne/NeuralHermes-2.5-Mistral-7B
    parameters:
      density: 0.63
      weight: 0.83
  - model: Intel/neural-chat-7b-v3-3
    parameters:
      density: 0.63
      weight: 0.74
  - model: meta-math/MetaMath-Mistral-7B
    parameters:
      density: 0.63
      weight: 0.22
  - model: openchat/openchat-3.5-0106
    parameters:
      density: 0.63
      weight: 0.37
  - model: Open-Orca/Mistral-7B-OpenOrca
    parameters:
      density: 0.63
      weight: 0.76
  - model: cognitivecomputations/dolphin-2.2.1-mistral-7b
    parameters:
      density: 0.63
      weight: 0.69
  - model: viethq188/LeoScorpius-7B-Chat-DPO
    parameters:
      density: 0.63
      weight: 0.38
  - model: GreenNode/GreenNode-mini-7B-multilingual-v1olet
    parameters:
      density: 0.63
      weight: 0.13
  - model: berkeley-nest/Starling-LM-7B-alpha
    parameters:
      density: 0.63
      weight: 0.33
merge_method: dare_ties
parameters:
  normalize: true
  int8_mask: true
dtype: bfloat16
```
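A merge like this can be reproduced with mergekit, either via its `mergekit-yaml` CLI or its Python API. The sketch below follows the pattern from mergekit's README, but API details may differ across versions, and `config.yml` / `./merged` are placeholder paths:

```python
import yaml
import torch

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Load the YAML configuration shown above (saved to config.yml, a placeholder path)
with open("config.yml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    merge_config,
    out_path="./merged",                 # placeholder output directory
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # use a GPU if one is available
        copy_tokenizer=True,             # copy a tokenizer into the output
    ),
)
```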