---
license: bigscience-bloom-rail-1.0
pipeline_tag: text-generation
library_name: transformers
tags:
  - dolly
  - bloomz
  - Spanish
datasets:
  - dvilasuero/databricks-dolly-15k-es-deepl
inference: false
widget:
  - text: >-
      Below is an instruction that describes a task, paired with an input that
      provides further context. Write a response that appropriately completes
      the request.
      ### Instruction: Tell me about alpacas
language:
  - es
---
*Alpacoom logo*
# DOLLOOM: Dolly 🐑 + BLOOMz 💮

## Adapter Description

This adapter was created with the [PEFT](https://github.com/huggingface/peft) library and was used to fine-tune the base model **BigScience/BLOOMz 7B1** on the **Dolly dataset (translated to Spanish)** using the **LoRA** method.

## Model Description

Instruction-tuned version of BigScience Large Open-science Open-access Multilingual: [BLOOMz 7B1 MT](https://huggingface.co/bigscience/bloomz-7b1-mt).

## Training data

TBA

### Supported Tasks and Leaderboards

TBA

### Training procedure

TBA

## How to use

TBA
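
Until the official usage instructions are published, the snippet below is a minimal sketch of how a PEFT LoRA adapter on top of `bigscience/bloomz-7b1-mt` is typically loaded and queried. The adapter repository id (`ADAPTER_REPO`) is a placeholder, not a confirmed path, and the `### Response:` marker in the prompt is an assumed convention extrapolated from the Alpaca-style prompt shown in the widget above.

```python
# Minimal sketch (not official usage): load the LoRA adapter on top of the
# BLOOMz 7B1 MT base model with the PEFT library and generate one response.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

BASE_MODEL = "bigscience/bloomz-7b1-mt"
ADAPTER_REPO = "your-username/your-adapter-repo"  # placeholder: replace with this adapter's repo id

tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)
base_model = AutoModelForCausalLM.from_pretrained(
    BASE_MODEL,
    torch_dtype=torch.float16,  # half precision to fit the 7B model on a single GPU
    device_map="auto",          # requires the `accelerate` package
)
model = PeftModel.from_pretrained(base_model, ADAPTER_REPO)
model.eval()

# Alpaca-style prompt matching the widget example; the "### Response:" cue is an assumption.
prompt = (
    "Below is an instruction that describes a task, paired with an input that "
    "provides further context. Write a response that appropriately completes "
    "the request.\n\n"
    "### Instruction:\nTell me about alpacas\n\n"
    "### Response:\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    output = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

Loading in `float16` with `device_map="auto"` keeps the memory footprint of the 7B base model manageable on a single GPU; the LoRA weights themselves add only a small overhead on top of the base checkpoint.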