---
license: openrail
---
Base model:
https://huggingface.co/TheBloke/wizardLM-7B-HF/tree/main
Model trained on the following data:
https://huggingface.co/datasets/gmongaras/reddit_negative
Trained for about 700 steps with a batch size of 8, 2 gradient accumulation steps, and LoRA adapters on all layers.
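
A minimal sketch of this training setup using `transformers` + `peft`. The batch size (8), gradient accumulation steps (2), step count (~700), and "LoRA on all layers" come from this card; the LoRA rank/alpha/dropout, learning rate, target module names, and the dataset's text column name are assumptions, not values from the actual run.

```python
from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

BASE = "TheBloke/wizardLM-7B-HF"

tokenizer = AutoTokenizer.from_pretrained(BASE)
tokenizer.pad_token = tokenizer.eos_token  # LLaMA tokenizers ship without a pad token
model = AutoModelForCausalLM.from_pretrained(BASE)

# "All layers": target every linear projection in the LLaMA blocks (assumed module names).
lora_config = LoraConfig(
    r=8,                # assumed rank
    lora_alpha=16,      # assumed scaling
    lora_dropout=0.05,  # assumed dropout
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj",
                    "gate_proj", "up_proj", "down_proj"],
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)

dataset = load_dataset("gmongaras/reddit_negative", split="train")

def tokenize(batch):
    # "text" is an assumed column name for the dataset's raw strings.
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = dataset.map(tokenize, batched=True, remove_columns=dataset.column_names)

args = TrainingArguments(
    output_dir="wizardlm-7b-reddit-negative-lora",
    per_device_train_batch_size=8,
    gradient_accumulation_steps=2,
    max_steps=700,
    learning_rate=2e-4,  # assumed learning rate
    logging_steps=50,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
model.save_pretrained(args.output_dir)  # saves only the LoRA adapter weights
```

With this setup, `save_pretrained` on the PEFT-wrapped model writes only the adapter weights, which is why this repository is small and the base model must be loaded separately from TheBloke/wizardLM-7B-HF before applying the adapter.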