---
base_model:
  - SvalTek/Gemma-7B-ColdBrew-RP
  - KishoreK/ActionGemma-9B
library_name: transformers
tags:
  - mergekit
  - merge
---

# merge

This is a merge of pre-trained language models created using mergekit.

## Merge Details

### Merge Method

This model was merged using the SLERP merge method, with SvalTek/Gemma-7B-ColdBrew-RP as the base model.
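SLERP (spherical linear interpolation) blends the two models' weights along a great-circle arc between the weight vectors rather than along a straight line, which tends to preserve the scale of each tensor better than plain averaging. A minimal illustrative sketch of the interpolation applied to a pair of flattened weight tensors (this is a simplification, not mergekit's actual implementation):

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherically interpolate between two flat weight vectors.

    t=0 returns v0, t=1 returns v1; intermediate t follows the arc
    between the two directions.
    """
    # Angle between the two directions, from the normalized dot product
    v0_n = v0 / np.linalg.norm(v0)
    v1_n = v1 / np.linalg.norm(v1)
    dot = np.clip(np.dot(v0_n, v1_n), -1.0, 1.0)

    # Fall back to linear interpolation when the vectors are nearly
    # colinear (sin(theta) ~ 0 would be numerically unstable)
    if abs(dot) > 1.0 - eps:
        return (1 - t) * v0 + t * v1

    theta = np.arccos(dot)
    sin_theta = np.sin(theta)
    return (np.sin((1 - t) * theta) / sin_theta) * v0 + \
           (np.sin(t * theta) / sin_theta) * v1
```

With `t = 0.5` and two orthogonal unit vectors, the result lies halfway along the arc between them, keeping unit norm, whereas a straight average would shrink it.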

### Models Merged

The following models were included in the merge:

- SvalTek/Gemma-7B-ColdBrew-RP
- KishoreK/ActionGemma-9B

### Configuration

The following YAML configuration was used to produce this model:

```yaml
merge_method: slerp
base_model: "SvalTek/Gemma-7B-ColdBrew-RP"
slices:
  - sources:
      - model: "SvalTek/Gemma-7B-ColdBrew-RP"
        layer_range: [0, 42]
      - model: "KishoreK/ActionGemma-9B"
        layer_range: [0, 42]
    parameters:
      t:
        - filter: self_attn  # coherence from ActionGemma
          value: [0.3, 0.5, 0.7, 0.9, 1]
        - filter: mlp  # creativity from ColdBrew
          value: [0.7, 0.5, 0.3, 0.1, 0]
        - filter: layer_norm  # consistency from ActionGemma
          value: [0.4, 0.6, 0.8, 1, 1]
        - filter: pos_embed  # sequence understanding from ActionGemma
          value: [0.5, 0.7, 0.9, 1, 1]
        - value: 0.5
dtype: bfloat16
tokenizer_source: base
```
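Each `value` list in the configuration is a gradient: a short list of anchor values that is expanded into one interpolation factor `t` per layer, so early layers lean toward one parent model and later layers toward the other. A rough sketch of that expansion, assuming simple piecewise-linear interpolation across the layer range (the helper name is hypothetical, for illustration only):

```python
import numpy as np

def gradient_values(anchors, num_layers):
    """Expand a short anchor list into one t value per layer.

    The anchors are spread evenly over [0, 1] and each layer's t is
    read off by piecewise-linear interpolation between them.
    """
    anchor_pos = np.linspace(0.0, 1.0, num=len(anchors))
    layer_pos = np.linspace(0.0, 1.0, num=num_layers)
    return np.interp(layer_pos, anchor_pos, anchors)

# e.g. the self_attn gradient [0.3, 0.5, 0.7, 0.9, 1] stretched
# over a 42-layer model:
t_per_layer = gradient_values([0.3, 0.5, 0.7, 0.9, 1.0], 42)
```

Under this reading, `t` near 0 keeps a layer close to the base model (ColdBrew) and `t` near 1 pulls it toward ActionGemma, which matches the intent noted in the inline comments.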