---
library_name: peft
base_model: HuggingFaceH4/zephyr-7b-beta
datasets:
- w95/databricks-dolly-15k-az
license: mit
language:
- az
---
# Model Card for hajili/zephyr-7b-beta-dolly-azerbaijani
This model was built via parameter-efficient fine-tuning (PEFT) of the HuggingFaceH4/zephyr-7b-beta base model on the first 8,000 rows of w95/databricks-dolly-15k-az.
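The training script is not included in this card, but the dataset slice described above can be selected with the `datasets` library. A minimal sketch (the `train` split name is an assumption, not confirmed by this card):

```python
from datasets import load_dataset

# Take the first 8,000 examples of the Azerbaijani Dolly dataset
# (the "train" split name is an assumption).
dataset = load_dataset("w95/databricks-dolly-15k-az", split="train[:8000]")
```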
## Model Details

### Model Description
- Developed by: Mammad Hajili
- Model type: Causal LM
- Language(s) (NLP): Azerbaijani
- License: mit
- Finetuned from model: HuggingFaceH4/zephyr-7b-beta
## Training procedure

The following bitsandbytes quantization config was used during training:
- quant_method: bitsandbytes
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: False
- bnb_4bit_compute_dtype: bfloat16
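For reference, the config above corresponds to the following `transformers` `BitsAndBytesConfig`. This is a reconstruction from the listed values, not the original training script:

```python
import torch
from transformers import BitsAndBytesConfig

# 4-bit NF4 quantization with bfloat16 compute, matching the values above
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=False,
    bnb_4bit_compute_dtype=torch.bfloat16,
)
```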
### Framework versions
- PEFT 0.6.3.dev0
## How to use

Load the base model, then attach the PEFT adapter on top:

```python
from peft import PeftModel, PeftConfig
from transformers import AutoModelForCausalLM

# Resolve the base model from the adapter config
config = PeftConfig.from_pretrained("hajili/zephyr-7b-beta-dolly-azerbaijani")

# Load the base model, then apply the fine-tuned adapter weights
model = AutoModelForCausalLM.from_pretrained(config.base_model_name_or_path)
model = PeftModel.from_pretrained(model, "hajili/zephyr-7b-beta-dolly-azerbaijani")
```
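A minimal generation sketch, assuming the base model's tokenizer and chat template; the prompt below is only an illustration:

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("HuggingFaceH4/zephyr-7b-beta")

# Format a user turn with the base model's chat template
# ("Give brief information about Azerbaijan.")
messages = [{"role": "user", "content": "Azərbaycan haqqında qısa məlumat ver."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)

outputs = model.generate(input_ids, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```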