# llama-2-7b-guanaco
---
license: apache-2.0
datasets:
  - timdettmers/openassistant-guanaco
pipeline_tag: text-generation
---

This model was fine-tuned in 4-bit precision using QLoRA on the `timdettmers/openassistant-guanaco` dataset, with the adapter weights merged back into the base model after training.

It was made using a Google Colab notebook.

It can easily be loaded with the `AutoModelForCausalLM` class from `transformers`:

```python
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("mlabonne/llama-2-7b-guanaco")
```
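Once loaded, the model can be used for text generation. A minimal sketch, assuming the Guanaco-style `### Human:` / `### Assistant:` prompt format used by the `openassistant-guanaco` dataset (the exact template is an assumption, not stated in this card):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mlabonne/llama-2-7b-guanaco"
model = AutoModelForCausalLM.from_pretrained(model_id)
tokenizer = AutoTokenizer.from_pretrained(model_id)

# Guanaco-style prompt format (assumption based on the fine-tuning dataset)
prompt = "### Human: What is a large language model?### Assistant:"
inputs = tokenizer(prompt, return_tensors="pt")

# Generate a completion; tune max_new_tokens to taste
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```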