nroggendorff committed on
Commit
78f7207
1 Parent(s): 2686a80

Update README.md

Files changed (1): README.md (+2 -2)
README.md CHANGED
@@ -19,7 +19,7 @@ model-index:
 
 # Edgar Allen Poe LLM
 
-Mayo is a language model fine-tuned on the [EAP dataset](https://huggingface.co/datasets/nroggendorff/eap) using Supervised Fine-Tuning (SFT) and Teacher Reinforced Learning (TRL) techniques. It is based on the [Mistral 7b Model](mistralai/Mistral-7B-Instruct-v0.3)
+EAP is a language model fine-tuned on the [EAP dataset](https://huggingface.co/datasets/nroggendorff/eap) using Supervised Fine-Tuning (SFT) and Teacher Reinforced Learning (TRL) techniques. It is based on the [Mistral 7b Model](mistralai/Mistral-7B-Instruct-v0.3)
 
 ## Features
 
@@ -28,7 +28,7 @@ Mayo is a language model fine-tuned on the [EAP dataset](https://huggingface.co/
 
 ## Usage
 
-To use the Mayo LLM, you can load the model using the Hugging Face Transformers library:
+To use the LLM, you can load the model using the Hugging Face Transformers library:
 
 ```python
 from transformers import AutoTokenizer, AutoModelForCausalLM, BitsAndBytesConfig
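
The README hunk above is cut off at the import line by the diff context, so the rest of the usage snippet is not shown in this commit. The following is a minimal sketch of how such a load might continue, assuming the model is published under a Hugging Face repo id (the placeholder `nroggendorff/mayo` is hypothetical) and that the `BitsAndBytesConfig` import implies 4-bit quantized loading; it is not the repository's actual code.

```python
# Hedged sketch, not the README's actual snippet: everything below the import is assumed.
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM, BitsAndBytesConfig

model_id = "nroggendorff/mayo"  # hypothetical repo id; replace with the real model path

# The BitsAndBytesConfig import suggests quantized loading; this exact config is an assumption.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",
)

# Generate a short continuation from a prompt.
prompt = "Once upon a midnight dreary"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```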