
Jokestral

This model was created by fine-tuning unsloth/mistral-7b-v0.3-bnb-4bit on the Short jokes dataset, so its only purpose is generating cringe jokes.
Just write the first few words of a joke and the model completes it.

Usage

Google Colab example

pip install transformers
pip install --no-deps "trl<0.9.0" peft accelerate bitsandbytes
from transformers import AutoTokenizer, AutoModelForCausalLM

# Load the 4-bit quantized model and its tokenizer;
# device_map="auto" places the model on the GPU so it matches the inputs below
model = AutoModelForCausalLM.from_pretrained("SantaBot/Jokestral_4bit", device_map="auto")
tokenizer = AutoTokenizer.from_pretrained("SantaBot/Jokestral_4bit")

inputs = tokenizer(
    ["My doctor"],  # YOUR PROMPT HERE
    return_tensors="pt",
).to("cuda")

outputs = model.generate(**inputs, max_new_tokens=64, use_cache=True)
tokenizer.batch_decode(outputs)

The output should be something like:
['<s> My doctor told me I have to stop m4sturb4t1ng. I asked him why and he said ""Because I\'m trying to examine you.""\n</s>']
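The decoded string keeps Mistral's BOS/EOS markers (`<s>`, `</s>`). You can pass `skip_special_tokens=True` to `tokenizer.batch_decode` to drop them, or strip them manually. A minimal sketch of the manual approach (the `clean_joke` helper is hypothetical, not part of the model card):

```python
def clean_joke(raw: str) -> str:
    # Strip Mistral's BOS/EOS markers and surrounding whitespace
    return raw.replace("<s>", "").replace("</s>", "").strip()

joke = clean_joke("<s> My doctor told me to watch my drinking. </s>")
print(joke)  # My doctor told me to watch my drinking.
```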

Model size: 3.87B params (Safetensors)
Tensor types: F32, FP16, U8
