license: mit
datasets:
- NobodyExistsOnTheInternet/GiftedConvoBeforeEcons
Trained on over 20k instruction examples, all generated by GPT-4 or written by humans.

Dataset features:
- 1,000 long evolved conversations based on LIMA
- A subset of correct PRM800K data
- A subset of CamelAI's Physics and Chemistry data
The model was trained with QLoRA using Axolotl.
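For reference, a minimal QLoRA setup in Python (via `transformers` + `peft`) looks roughly like the sketch below. The base model id and hyperparameters are placeholders, not the actual Axolotl configuration used for this model.

```python
# Hedged sketch of a QLoRA setup; the model id and hyperparameters are
# placeholders, not the exact Axolotl configuration used for this model.
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,                      # QLoRA: frozen base model quantized to 4-bit
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

base = AutoModelForCausalLM.from_pretrained(
    "base-model-id",                        # placeholder for the actual base model
    quantization_config=bnb_config,
    device_map="auto",
)

lora = LoraConfig(                          # LoRA adapters trained on top of the 4-bit base
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    task_type="CAUSAL_LM",
)
model = get_peft_model(base, lora)
```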
The prompt format is Vicuna 1.1:
```
User: ...
Assistant: ...
```
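A minimal inference sketch using this prompt format is shown below; the model id and generation settings are placeholders, not official recommendations.

```python
# Minimal inference sketch using the Vicuna 1.1 prompt format.
# The model id and generation settings are placeholders.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-model-id"                  # placeholder
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Build the prompt in the User/Assistant format and generate a reply.
prompt = "User: Explain why the sky is blue.\nAssistant:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```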