ExLLaMA V2 quant of Yi-34B-GiftedConvo-merged-4.0bpw-h6-exl2
---
license: mit
datasets:
- NobodyExistsOnTheInternet/GiftedConvoBeforeEcons
---
Trained on over 20k instruct examples, all generated by GPT-4 or humans.
Dataset features:
- 1,000 long, evolved conversations based on LIMA
- A subsection of correct PRM800K data
- A subsection of CamelAI's Physics and Chemistry data
The model was trained with QLoRA using Axolotl.
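As a rough illustration, a QLoRA run in Axolotl is driven by a YAML config along these lines. This is a hedged sketch, not the actual config used for this model; the specific hyperparameter values shown here are assumptions.

```yaml
# Hypothetical Axolotl config sketch for a QLoRA fine-tune of a Yi-34B base.
# Values are illustrative only, not the settings used for this model.
base_model: 01-ai/Yi-34B
adapter: qlora          # QLoRA: 4-bit base weights + LoRA adapters
load_in_4bit: true
lora_r: 32
lora_alpha: 16
lora_dropout: 0.05
lora_target_linear: true
datasets:
  - path: NobodyExistsOnTheInternet/GiftedConvoBeforeEcons
    type: sharegpt
sequence_len: 4096
micro_batch_size: 1
gradient_accumulation_steps: 8
```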
The prompt format is Vicuna 1.1:
```
User: ...
Assistant: ...
```
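The format above can be assembled programmatically. A minimal sketch, assuming a helper named `format_vicuna` (the name and signature are hypothetical, not part of the model release); the trailing `Assistant:` label leaves the model to complete the next turn.

```python
def format_vicuna(turns):
    """Render (role, text) turns in the Vicuna 1.1 style.

    `turns` is a list of ("user" | "assistant", text) pairs.
    Ends with a bare "Assistant:" label so the model continues from there.
    """
    parts = []
    for role, text in turns:
        label = "User" if role == "user" else "Assistant"
        parts.append(f"{label}: {text}")
    parts.append("Assistant:")
    return "\n".join(parts)

prompt = format_vicuna([("user", "What is LIMA?")])
# prompt == "User: What is LIMA?\nAssistant:"
```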