dmayhem93 committed on
Commit
3b77dcd
1 Parent(s): 863ba93

Update README.md

Files changed (1): README.md +5 -5
README.md CHANGED

@@ -22,9 +22,9 @@ Start chatting with `Stable Beluga 2` using the following code snippet:
 import torch
 from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline
 
-tokenizer = AutoTokenizer.from_pretrained("stabilityai/FreeWilly2", use_fast=False)
-model = AutoModelForCausalLM.from_pretrained("stabilityai/FreeWilly2", torch_dtype=torch.float16, low_cpu_mem_usage=True, device_map="auto")
-system_prompt = "### System:\nYou are Free Willy, an AI that follows instructions extremely well. Help as much as you can. Remember, be safe, and don't do anything illegal.\n\n"
+tokenizer = AutoTokenizer.from_pretrained("stabilityai/StableBeluga2", use_fast=False)
+model = AutoModelForCausalLM.from_pretrained("stabilityai/StableBeluga2", torch_dtype=torch.float16, low_cpu_mem_usage=True, device_map="auto")
+system_prompt = "### System:\nYou are Stable Beluga, an AI that follows instructions extremely well. Help as much as you can. Remember, be safe, and don't do anything illegal.\n\n"
 
 message = "Write me a poem please"
 prompt = f"{system_prompt}### User: {message}\n\n### Assistant:\n"
@@ -34,7 +34,7 @@ output = model.generate(**inputs, do_sample=True, top_p=0.95, top_k=0, max_new_t
 print(tokenizer.decode(output[0], skip_special_tokens=True))
 ```
 
-FreeWilly should be used with this prompt format:
+Stable Beluga 2 should be used with this prompt format:
 ```
 ### System:
 This is a system prompt, please behave and help the user.
@@ -42,7 +42,7 @@ This is a system prompt, please behave and help the user.
 ### User:
 Your prompt here
 
-### Assistant
+### Assistant:
 The output of Stable Beluga 2
 ```
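For reference, the prompt construction in the updated snippet is plain string formatting and can be sketched on its own, without downloading the model (a minimal sketch; model loading and `generate` are omitted):

```python
# Minimal sketch of the prompt format from the updated README snippet.
# Only the string assembly is shown; tokenizer/model calls are omitted.
system_prompt = (
    "### System:\nYou are Stable Beluga, an AI that follows instructions "
    "extremely well. Help as much as you can. Remember, be safe, and don't "
    "do anything illegal.\n\n"
)
message = "Write me a poem please"
prompt = f"{system_prompt}### User: {message}\n\n### Assistant:\n"
print(prompt)
```

Note that the assembled string ends with the `### Assistant:\n` header, so the model's generation continues directly as the assistant's reply.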