update(README): add citation
README.md
CHANGED
````diff
@@ -81,6 +81,7 @@ print(tokenizer.decode(tokens[0], skip_special_tokens=True))
 * **Developed by**: [Stability AI](https://stability.ai/)
 * **Model type**: `Stable LM 2 12B` models are auto-regressive language models based on the transformer decoder architecture.
 * **Language(s)**: English
+* **Paper**: [Stable LM 2 Technical Report](https://arxiv.org/abs/2402.17834)
 * **Library**: [GPT-NeoX](https://github.com/EleutherAI/gpt-neox)
 * **License**: [Stability AI Non-Commercial Research Community License](https://huggingface.co/stabilityai/stablelm-2-12b/blob/main/LICENSE). If you'd like to use this model for commercial products or purposes, please contact us [here](https://stability.ai/membership) to learn more.
 * **Contact**: For questions and comments about the model, please email `[email protected]`
@@ -129,10 +130,11 @@ As a base model, this model may exhibit unreliable, unsafe, or other undesirable
 
 ## How to Cite
 
-```
-@
-
-
-
+```
+@article{bellagente2024stable,
+  title={Stable LM 2 1.6 B Technical Report},
+  author={Bellagente, Marco and Tow, Jonathan and Mahan, Dakota and Phung, Duy and Zhuravinskyi, Maksym and Adithyan, Reshinth and Baicoianu, James and Brooks, Ben and Cooper, Nathan and Datta, Ashish and others},
+  journal={arXiv preprint arXiv:2402.17834},
+  year={2024}
 }
 ```
````