Text Generation
Transformers
PyTorch
mistral
Not-For-All-Audiences
nsfw
text-generation-inference
Inference Endpoints
Update README.md
README.md CHANGED
@@ -21,7 +21,9 @@ The dataset consists of 5800 samples, with the composition as follows:
 * Norquinal/claude_multiround_chat_1k (~17%)
 * jundurbin/airoboros-gpt4-1.4 (~17%)
 * totally-not-an-llm/EverythingLM-data-V2-sharegpt (~17%)
-
+
+These samples were then back-filled using gpt-4/gpt-3.5-turbo-16k or otherwise converted to fit the prompt format.
+
 ## Prompt Format
 The model was finetuned with a prompt format similar to the original SuperHOT prototype:
 ```