# PsyMedLewd
---
base_model:
  - Undi95/MXLewd-L2-20B
  - Undi95/PsyMedRP-v1-20B
library_name: transformers
tags:
  - mergekit
  - merge
---


PsyMedLewd: a merge of two of my favourite models for sci-fi stories:

- [Undi95/PsyMedRP-v1-20B](https://huggingface.co/Undi95/PsyMedRP-v1-20B)
- [Undi95/MXLewd-L2-20B](https://huggingface.co/Undi95/MXLewd-L2-20B)

Currently testing more merge iterations of these two models.

Warning: Cute alien girls inside!

## Recommended settings for all PsyMedLewd versions

(based on KoboldCPP):

- Temperature: 1.3
- Max Ctx. Tokens: 4096
- Top p Sampling: 0.99
- Repetition Penalty: 1.09
- Amount to Gen.: 512

Prompt template: Alpaca or ChatML
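As a rough sketch, the settings above map onto KoboldCPP's HTTP generate API like this. The field names and the default `localhost:5001` endpoint follow KoboldCPP's standard API; adjust the port (and the placeholder prompt) for your setup:

```python
# Sketch: sending the recommended sampler settings to a local
# KoboldCPP server. Assumes KoboldCPP is running with its default
# API endpoint; only the standard library is used.
import json
import urllib.request

RECOMMENDED = {
    "max_context_length": 4096,  # Max Ctx. Tokens
    "max_length": 512,           # Amount to Gen.
    "temperature": 1.3,
    "top_p": 0.99,
    "rep_pen": 1.09,             # Repetition Penalty
}

def build_payload(prompt: str) -> dict:
    """Combine a prompt with the recommended sampler settings."""
    return {"prompt": prompt, **RECOMMENDED}

def generate(prompt: str,
             url: str = "http://localhost:5001/api/v1/generate") -> str:
    """POST the payload to KoboldCPP and return the generated text."""
    req = urllib.request.Request(
        url,
        data=json.dumps(build_payload(prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["results"][0]["text"]
```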

First iteration. More to come...

---

This is a merge of pre-trained language models created using mergekit.

## Merge Details

### Merge Method

This model was merged using the SLERP merge method.
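For intuition, SLERP blends two weight tensors along the arc between them rather than along a straight line, which preserves more of each model's geometry than plain averaging. A minimal NumPy sketch of the idea (not mergekit's actual implementation):

```python
import numpy as np

def slerp(t: float, v0: np.ndarray, v1: np.ndarray,
          eps: float = 1e-8) -> np.ndarray:
    """Spherical linear interpolation between two flat weight tensors.

    t = 0 returns v0, t = 1 returns v1; intermediate values follow
    the arc between the two directions.
    """
    # Angle between the two tensors, measured on their unit directions
    a = v0 / (np.linalg.norm(v0) + eps)
    b = v1 / (np.linalg.norm(v1) + eps)
    dot = np.clip(np.dot(a, b), -1.0, 1.0)
    omega = np.arccos(dot)

    # Nearly parallel tensors: fall back to ordinary linear interpolation
    if np.abs(np.sin(omega)) < eps:
        return (1 - t) * v0 + t * v1

    so = np.sin(omega)
    return (np.sin((1 - t) * omega) / so) * v0 + (np.sin(t * omega) / so) * v1
```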

### Models Merged

The following models were included in the merge:

- Undi95/PsyMedRP-v1-20B
- Undi95/MXLewd-L2-20B

### Configuration

The following YAML configuration was used to produce this model:

```yaml
slices:
  - sources:
      - model: Undi95/PsyMedRP-v1-20B
        layer_range: [0, 62]  # PsyMedRP has 62 layers
      - model: Undi95/MXLewd-L2-20B
        layer_range: [0, 62]  # MXLewd has 62 layers
merge_method: slerp  # Or another mergekit method such as linear
base_model: Undi95/MXLewd-L2-20B  # Either model can serve as the base
parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]  # Tune these for the desired effect
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5  # Default interpolation weight
dtype: bfloat16  # Or float16 / float32 if preferred
```
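The five-element `value` lists are anchor points that get spread across the model's depth, so each layer ends up with its own interpolation weight `t` (0 keeps the base model's weights, 1 keeps the other model's). A rough sketch of that expansion, assuming simple linear interpolation between anchors (which is my understanding of how mergekit expands gradient lists):

```python
import numpy as np

def expand_gradient(anchors: list[float], n_layers: int) -> np.ndarray:
    """Spread a short list of anchor values evenly across n_layers,
    linearly interpolating between neighbouring anchors."""
    xs = np.linspace(0.0, 1.0, len(anchors))   # anchor positions in [0, 1]
    depth = np.linspace(0.0, 1.0, n_layers)    # relative depth of each layer
    return np.interp(depth, xs, anchors)
```

With `[0, 0.5, 0.3, 0.7, 1]` on `self_attn`, for example, early layers take their attention weights mostly from the base model and late layers mostly from PsyMedRP, with a wavering blend in between.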