---
language:
  - en
pipeline_tag: conversational
library_name: adapter-transformers
license: mit
---


# SG Raccoon Yi 55B

An auto-regressive causal language model created by merging two finetuned Yi-34B models into one.

## Prompting Format

Single-turn:

```
<|startoftext|>Human: Hello!\n\nAssistant: <|endoftext|>
```

Multi-turn:

```
<|startoftext|>Human: Hello!\n\nAssistant: <|endoftext|>Hi!<|endoftext|>Human: How are you?\n\nAssistant: <|endoftext|>target2<|endoftext|>
```
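For reference, here is a minimal generation sketch using the single-turn format above. It is an unofficial example: the repo id is inferred from this model card, and it assumes `transformers`, `accelerate`, and enough GPU memory for a 55B model.

```python
# Minimal sketch: single-turn prompting in the format described above.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mlinmg/SG-Raccoon-Yi-55B"  # inferred from this model card
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto", torch_dtype="auto")

# The prompt already carries <|startoftext|>, so disable the tokenizer's own
# special tokens to avoid inserting a duplicate BOS token.
prompt = "<|startoftext|>Human: Hello!\n\nAssistant: "
inputs = tokenizer(prompt, return_tensors="pt", add_special_tokens=False).to(model.device)
output = model.generate(**inputs, max_new_tokens=128)

# Decode only the newly generated tokens; generation stops at <|endoftext|>.
print(tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```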

## Merge process

The models used in the merge are dolphin-2_2-yi-34b and OrionStar-Yi-34B-Chat-Llama.

The layer ranges used are as follows (a reproduction sketch with mergekit follows the list):

- OrionStar-Yi-34B-Chat: layers [0, 16]
- dolphin-2_2-yi-34b: layers [8, 24]
- OrionStar-Yi-34B-Chat: layers [17, 32]
- dolphin-2_2-yi-34b: layers [25, 40]
- OrionStar-Yi-34B-Chat: layers [33, 48]
- dolphin-2_2-yi-34b: layers [41, 56]
- OrionStar-Yi-34B-Chat: layers [49, 64]
- dolphin-2_2-yi-34b: layers [57, 72]
- OrionStar-Yi-34B-Chat: layers [65, 80]
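This interleaving corresponds to a mergekit passthrough merge over model slices. The snippet below is a hedged sketch of how such a merge could be reproduced; the Hugging Face repo paths and the float16 dtype are assumptions, not taken from the original configuration.

```python
# Sketch: rebuild the interleaved merge with mergekit's passthrough method.
# Assumes mergekit is installed (pip install mergekit) and that the two source
# models live at the repo paths below.
import subprocess

MERGE_CONFIG = """\
slices:
  - sources:
      - model: OrionStarAI/OrionStar-Yi-34B-Chat-Llama   # assumed repo path
        layer_range: [0, 16]
  - sources:
      - model: cognitivecomputations/dolphin-2_2-yi-34b  # assumed repo path
        layer_range: [8, 24]
  - sources:
      - model: OrionStarAI/OrionStar-Yi-34B-Chat-Llama
        layer_range: [17, 32]
  - sources:
      - model: cognitivecomputations/dolphin-2_2-yi-34b
        layer_range: [25, 40]
  - sources:
      - model: OrionStarAI/OrionStar-Yi-34B-Chat-Llama
        layer_range: [33, 48]
  - sources:
      - model: cognitivecomputations/dolphin-2_2-yi-34b
        layer_range: [41, 56]
  - sources:
      - model: OrionStarAI/OrionStar-Yi-34B-Chat-Llama
        layer_range: [49, 64]
  - sources:
      - model: cognitivecomputations/dolphin-2_2-yi-34b
        layer_range: [57, 72]
  - sources:
      - model: OrionStarAI/OrionStar-Yi-34B-Chat-Llama
        layer_range: [65, 80]
merge_method: passthrough
dtype: float16  # assumed
"""

with open("sg-raccoon.yml", "w", encoding="utf-8") as f:
    f.write(MERGE_CONFIG)

# mergekit-yaml <config> <output-dir> writes the merged model to disk.
subprocess.run(["mergekit-yaml", "sg-raccoon.yml", "./SG-Raccoon-Yi-55B"], check=True)
```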

## Benchmarks

Coming soon.

## Acknowledgements

- Special thanks to MSS for sponsoring this project.
- @chargoddard for developing mergekit, the framework used to merge this model.
- Great thanks to @Undi95 for helping figure out the model merge options.
- Credits to the 01-ai team for their amazing models.
- This merged model is inspired by Goliath 120B.