---
license: apache-2.0
language:
- en
library_name: transformers
---
A quick multi-disciplinary fine-tune of openchat/openchat-3.5-0106 using an Alpaca-style dataset spanning several disciplines. I trained LoRA adapters and then merged them back into the base model for ease of use.
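
Below is a minimal sketch of how LoRA adapters can be merged back into the base model with `peft`. The adapter path `./lora-adapter` and output path `./merged-model` are hypothetical placeholders, not the actual paths used for this model.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

# Load the base model and tokenizer.
base = AutoModelForCausalLM.from_pretrained(
    "openchat/openchat-3.5-0106",
    torch_dtype=torch.bfloat16,
)
tokenizer = AutoTokenizer.from_pretrained("openchat/openchat-3.5-0106")

# Attach the trained LoRA adapter (hypothetical local path),
# then fold its weights into the base model.
model = PeftModel.from_pretrained(base, "./lora-adapter")
merged = model.merge_and_unload()

# Save the standalone merged model so it loads without peft.
merged.save_pretrained("./merged-model")
tokenizer.save_pretrained("./merged-model")
```

Once merged, the model can be loaded directly with `AutoModelForCausalLM.from_pretrained` like any other `transformers` checkpoint, with no adapter files required.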