# Uploaded model
This model was created by fine-tuning `unsloth/mistral-7b-v0.3-bnb-4bit` on the [Short jokes dataset](https://www.kaggle.com/datasets/abhinavmoudgil95/short-jokes).
Its only purpose is generating cringe jokes.<br/>
Just write the first few words and get your joke.

# Usage

[Google Colab example](https://colab.research.google.com/drive/13N1O-fq-vjr8FUrsUU6f24fPpyf0ZwOS#scrollTo=UBSG1UTV85Vq)

```bash
pip install transformers
pip install --no-deps "trl<0.9.0" peft accelerate bitsandbytes
```
```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Load the 4-bit model and its tokenizer (a CUDA GPU is required)
model = AutoModelForCausalLM.from_pretrained("SantaBot/Jokestral_4bit")
tokenizer = AutoTokenizer.from_pretrained("SantaBot/Jokestral_4bit")

# Tokenize the first few words of a joke
inputs = tokenizer(
    [
        "My doctor"  # YOUR PROMPT HERE
    ],
    return_tensors="pt",
).to("cuda")

# Let the model finish the joke
outputs = model.generate(**inputs, max_new_tokens=64, use_cache=True)
print(tokenizer.batch_decode(outputs))
```
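The call above decodes greedily, so the same prompt always produces the same joke. For more variety you can pass sampling parameters to `model.generate`; a minimal sketch of such a configuration (the values are illustrative, not tuned):

```python
# Sampling settings for more varied jokes; the values are illustrative, not tuned.
gen_kwargs = {
    "max_new_tokens": 64,
    "do_sample": True,   # sample instead of greedy decoding
    "temperature": 0.9,  # <1.0 sharpens the distribution, >1.0 flattens it
    "top_p": 0.95,       # nucleus sampling: keep the smallest token set with 95% probability mass
    "use_cache": True,
}

# Used as: outputs = model.generate(**inputs, **gen_kwargs)
```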

**The output should be something like:**<br/>
`['<s> My doctor told me I have to stop m4sturb4t1ng. I asked him why and he said ""Because I\'m trying to examine you.""\n</s>']`
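Note that the decoded string still contains the `<s>`/`</s>` special tokens; you can pass `skip_special_tokens=True` to `batch_decode` to drop them, or strip them by hand. A minimal sketch of manual cleanup, assuming the format shown above (the example joke string here is hypothetical):

```python
def clean_joke(raw: str) -> str:
    """Strip the BOS/EOS markers and surrounding whitespace from a decoded string."""
    for token in ("<s>", "</s>"):
        raw = raw.replace(token, "")
    return raw.strip()

# Hypothetical decoded output in the same shape as the example above
raw = "<s> My doctor told me to stop eating at night. Now I eat in the dark.\n</s>"
print(clean_joke(raw))  # -> My doctor told me to stop eating at night. Now I eat in the dark.
```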