Upload README.md with huggingface_hub
README.md
CHANGED
@@ -22,8 +22,8 @@ tags:
 ![halos](https://gist.github.com/assets/29318529/fe2d8391-dbd1-4b7e-9dc4-7cb97e55bc06)
 
 This repo contains the model checkpoints for:
-- model family <b>
-- optimized with the loss <b>
+- model family <b>EleutherAI/pythia-2.8b</b>
+- optimized with the loss <b>PPO</b>
 - aligned using the SHP, Anthropic HH and Open Assistant datasets.
 
 To prompt Archangel models, ensure that the format is consistent with that of TuluV2.
@@ -40,8 +40,7 @@ Chocolate cake.
 ```
 Note that a beginning-of-sequence (BOS) token is automatically added by all Archangel models during tokenization and does not have to be added by you. No end-of-sequence (EOS) token is added to the prompt.
 
-
-To generate with these control tokens in the context, postpend either to the prompt.
+
 Please refer to our [code repository](https://github.com/ContextualAI/HALOs) or [blog](https://contextual.ai/better-cheaper-faster-llm-alignment-with-kto/) which contains instructions for training your own HALOs and links to our model cards.
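For readers who want to try the checkpoint described in this diff, the sketch below shows TuluV2-style prompting with the Hugging Face `transformers` library, following the README's notes that the tokenizer adds the BOS token itself and that no EOS token is appended to the prompt. It is not part of the original model card: the repo id `ContextualAI/archangel_ppo_pythia2-8b` is an assumption based on the Archangel naming scheme, and the `<|user|>`/`<|assistant|>` turn markers come from the TuluV2 format the README points to.

```python
# Minimal sketch (assumed repo id, not taken from this README):
# prompt an Archangel checkpoint using the TuluV2 chat format.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ContextualAI/archangel_ppo_pythia2-8b"  # assumption; substitute the actual model id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# TuluV2-style turn markers. Per the README, the tokenizer is expected to add
# the BOS token automatically, and no EOS token is appended to the prompt.
prompt = "<|user|>\nWhat's my favorite dessert?\n<|assistant|>\n"
inputs = tokenizer(prompt, return_tensors="pt")

output_ids = model.generate(**inputs, max_new_tokens=64, do_sample=False)

# Decode only the newly generated tokens, dropping the prompt portion.
completion = tokenizer.decode(
    output_ids[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
)
print(completion)
```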