A LLaMA model fine-tuned for one epoch on the Stanford Alpaca instruction-following dataset and quantized to 4-bit precision.
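As a rough illustration of what 4-bit weight quantization involves (the exact scheme used for this model, e.g. GPTQ or a GGML-style format, is not specified here), a minimal group-wise symmetric quantizer can be sketched as follows. Each group of weights shares one float scale, and each weight is stored as an integer in [-8, 7], the range of a signed 4-bit value:

```python
def quantize_4bit(weights, group_size=32):
    """Symmetric 4-bit quantization: each group of `group_size` weights
    shares one float scale; each weight becomes an int in [-8, 7]."""
    quants, scales = [], []
    for i in range(0, len(weights), group_size):
        group = weights[i:i + group_size]
        scale = max(abs(w) for w in group) / 7.0 or 1.0  # avoid div by zero
        quants.append([max(-8, min(7, round(w / scale))) for w in group])
        scales.append(scale)
    return quants, scales

def dequantize_4bit(quants, scales):
    """Reconstruct approximate float weights from the ints and scales."""
    return [q * s for qs, s in zip(quants, scales) for q in qs]

weights = [0.05 * (i - 32) for i in range(64)]  # toy stand-in for a weight row
q, s = quantize_4bit(weights)
restored = dequantize_4bit(q, s)
max_err = max(abs(a - b) for a, b in zip(weights, restored))
```

This trades memory (4 bits plus a shared scale per weight, versus 16 or 32 bits) for a bounded reconstruction error of at most half a quantization step per group; production quantizers refine this basic idea with smarter scale selection and error compensation.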