Update README.md
README.md CHANGED

````diff
@@ -31,10 +31,10 @@ If you are using GGUF I strongly advise using ChatML, for some reason that quant
 "<s>[INST] Prompt goes here [/INST]</s>"
 ```
 ### Context and Instruct
-~~[Mistral-Custom-Context.json](https://files.catbox.moe/l9w0ry.json)~~ <br/>
-~~[Mistral-Custom-Instruct.json](https://files.catbox.moe/9xiiwb.json)~~ <br/>
 [Magnum-123B-Context.json](https://files.catbox.moe/rkyqwg.json) <br/>
 [Magnum-123B-Instruct.json](https://files.catbox.moe/obb5oe.json) <br/>
+~~[Mistral-Custom-Context.json](https://files.catbox.moe/l9w0ry.json)~~ <br/>
+~~[Mistral-Custom-Instruct.json](https://files.catbox.moe/9xiiwb.json)~~ <br/>
 *** NOTE *** <br/>
 There have been reports of the quantized model misbehaving with the Mistral prompt; if you are seeing issues, it may be worth trying the ChatML Context and Instruct templates.
````
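For reference, a minimal sketch of the two prompt formats this README discusses — the Mistral instruct wrapping shown above and the ChatML alternative recommended for quantized/GGUF use. The helper function names are illustrative, and the exact special tokens ultimately depend on the model's tokenizer config; these follow the standard Mistral-instruct and ChatML conventions.

```python
# Sketch of the two prompt formats, assuming standard special tokens.

def mistral_prompt(user_msg: str) -> str:
    # Mistral instruct format: the user turn is wrapped in [INST] ... [/INST],
    # with the BOS token <s> at the start; </s> terminates a completed
    # assistant turn (note: </s>, not <\s>).
    return f"<s>[INST] {user_msg} [/INST]"

def chatml_prompt(user_msg: str) -> str:
    # ChatML format: each turn is delimited by <|im_start|> / <|im_end|>,
    # ending with an open assistant turn for the model to complete.
    return (
        f"<|im_start|>user\n{user_msg}<|im_end|>\n"
        "<|im_start|>assistant\n"
    )

print(mistral_prompt("Prompt goes here"))
print(chatml_prompt("Prompt goes here"))
```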