pankajmathur committed on
Commit
59cd5d1
1 Parent(s): d136f0e

Update README.md

Files changed (1)
  1. README.md +4 -5
README.md CHANGED
@@ -21,18 +21,17 @@ To get started with this incredible model, just use the ChatML prompt template a
 
 ```
 <|im_start|>system
-{system}<|im_end|>
+You are Dolphin, a helpful AI assistant.<|im_end|>
 <|im_start|>user
-{user}<|im_end|>
+{prompt}<|im_end|>
 <|im_start|>assistant
-{asistant}<|im_end|>
 ```
 
 The ChatML prompt template is available as a chat template, which means you can format messages using the tokenizer.apply_chat_template() method:
 ```
 messages = [
-{"role": "system", "content": "You are helpful AI asistant."},
-{"role": "user", "content": "Hello!"}
+{"role": "system", "content": "You are BrokenLlama, a helpful AI assistant."},
+{"role": "user", "content": "Hello BrokenLlama, what can you do for me?"}
 ]
 
 gen_input = tokenizer.apply_chat_template(messages, return_tensors="pt")
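For reference, the string that a ChatML chat template renders from such a `messages` list can be sketched in plain Python. This is a minimal illustration of the format only, with no `transformers` dependency; the helper name `format_chatml` is ours, not a library API — in practice you would call `tokenizer.apply_chat_template()` as shown in the README.

```python
# Sketch of ChatML rendering: each message becomes an <|im_start|>role ...
# <|im_end|> turn, and an open assistant turn is appended so the model
# generates the reply. Illustrative only; `format_chatml` is not a real API.

def format_chatml(messages, add_generation_prompt=True):
    """Render a list of {"role", "content"} dicts as a ChatML string."""
    parts = [
        f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n" for m in messages
    ]
    if add_generation_prompt:
        # Leave the assistant turn open for the model to complete.
        parts.append("<|im_start|>assistant\n")
    return "".join(parts)

messages = [
    {"role": "system", "content": "You are BrokenLlama, a helpful AI assistant."},
    {"role": "user", "content": "Hello BrokenLlama, what can you do for me?"},
]

print(format_chatml(messages))
```

With a real tokenizer, `tokenizer.apply_chat_template(messages, return_tensors="pt")` performs this rendering and tokenization in one step, returning token IDs ready to pass to `model.generate()`.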