---
license: openrail
---
Base model: https://huggingface.co/TheBloke/wizardLM-7B-HF/tree/main
Training dataset: https://huggingface.co/datasets/gmongaras/reddit_negative

Trained for about 700 steps with a per-device batch size of 8, 2 gradient accumulation steps (effective batch size 16), and LoRA adapters applied to all layers.
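
For reference, below is a minimal sketch of what this training setup might look like using the 🤗 Transformers and PEFT libraries. Only the batch size, accumulation steps, step count, base model, and dataset come from this card; the learning rate, LoRA rank/alpha, sequence length, target-module list, and the dataset's text column name are assumptions.

```python
from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

base = "TheBloke/wizardLM-7B-HF"
tokenizer = AutoTokenizer.from_pretrained(base)
# LLaMA-style tokenizers ship without a pad token; reuse EOS for padding.
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(base)

# "LoRA on all layers": target every linear projection in the transformer
# blocks. This target_modules list is an assumption for LLaMA-style models.
lora_config = LoraConfig(
    r=8,                      # rank: assumed, not stated in the card
    lora_alpha=16,            # assumed
    lora_dropout=0.05,        # assumed
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj",
                    "gate_proj", "up_proj", "down_proj"],
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)

dataset = load_dataset("gmongaras/reddit_negative", split="train")

def tokenize(batch):
    # The "text" column name is an assumption about the dataset schema.
    return tokenizer(batch["text"], truncation=True, max_length=512)

dataset = dataset.map(tokenize, batched=True,
                      remove_columns=dataset.column_names)

args = TrainingArguments(
    output_dir="out",
    per_device_train_batch_size=8,   # batch size 8, as stated above
    gradient_accumulation_steps=2,   # 2 accumulation steps, as stated above
    max_steps=700,                   # ~700 steps, as stated above
    learning_rate=2e-4,              # assumed; not stated in the card
    logging_steps=50,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=dataset,
    # Causal LM collator: pads batches and copies inputs to labels.
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```

Because only the LoRA adapter weights receive gradients, this trains a small fraction of the 7B parameters; mixed-precision or quantized loading (e.g. `fp16=True` or `load_in_8bit`) would likely be needed to fit the run on a single GPU, but is omitted here for simplicity.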