
To run the model, install ctransformers and use the following code:

# Install ctransformers with CUDA support (the leading "!" is for notebooks; drop it in a regular shell)
!pip install ctransformers[cuda]

from ctransformers import AutoModelForCausalLM

# Load the GGUF model file from the Hugging Face Hub repository
llm = AutoModelForCausalLM.from_pretrained("epsil/Tinyllama-1b-v1.0-gguf", model_file="Tinyllama-1b-v1.0.gguf")

# Generate a completion for the prompt
print(llm("AI is going to"))
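If a CUDA-capable GPU is available, ctransformers can offload model layers to it and stream tokens as they are generated. The sketch below reuses the repository and file name from above; the gpu_layers value is an arbitrary starting point and should be tuned to your hardware.

from ctransformers import AutoModelForCausalLM

# Offload layers to the GPU; 50 is an assumed starting value, adjust for your VRAM
llm = AutoModelForCausalLM.from_pretrained(
    "epsil/Tinyllama-1b-v1.0-gguf",
    model_file="Tinyllama-1b-v1.0.gguf",
    gpu_layers=50,
)

# Stream tokens as they are generated instead of waiting for the full completion
for text in llm("AI is going to", stream=True):
    print(text, end="", flush=True)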
Model details: GGUF format, 1.1B parameters, llama architecture.