FalconLLM committed
Commit 8281cc0 • 1 parent: 7f5eb0f

Update citation info

Files changed (1)
  1. README.md +23 -1
README.md CHANGED
@@ -182,7 +182,29 @@ Falcon-7B-Instruct was trained on a custom distributed training codebase, Gigatron.
 
 ## Citation
 
-*Paper coming soon 😊.*
+*Paper coming soon* 😊. In the meantime, you can use the following information to cite:
+```
+@article{falcon40b,
+title={{Falcon-40B}: an open large language model with state-of-the-art performance},
+author={Almazrouei, Ebtesam and Alobeidli, Hamza and Alshamsi, Abdulaziz and Cappelli, Alessandro and Cojocaru, Ruxandra and Debbah, Merouane and Goffinet, Etienne and Heslow, Daniel and Launay, Julien and Malartic, Quentin and Noune, Badreddine and Pannier, Baptiste and Penedo, Guilherme},
+year={2023}
+}
+```
+
+To learn more about the pretraining dataset, see the 📓 [RefinedWeb paper](https://arxiv.org/abs/2306.01116).
+
+```
+@article{refinedweb,
+title={The {R}efined{W}eb dataset for {F}alcon {LLM}: outperforming curated corpora with web data, and web data only},
+author={Guilherme Penedo and Quentin Malartic and Daniel Hesslow and Ruxandra Cojocaru and Alessandro Cappelli and Hamza Alobeidli and Baptiste Pannier and Ebtesam Almazrouei and Julien Launay},
+journal={arXiv preprint arXiv:2306.01116},
+eprint={2306.01116},
+eprinttype={arXiv},
+url={https://arxiv.org/abs/2306.01116},
+year={2023}
+}
+```
+
 
 ## License
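For reference, a minimal sketch of how the two BibTeX entries added above could be cited from a LaTeX document; the `falcon.bib` filename and the surrounding document are illustrative assumptions, not part of the commit:

```
% Minimal usage sketch: the two @article entries above are assumed
% to be saved in a local file named falcon.bib.
\documentclass{article}
\begin{document}
Falcon-40B \cite{falcon40b} was pretrained on RefinedWeb \cite{refinedweb}.
\bibliographystyle{plain}
\bibliography{falcon} % reads falcon.bib
\end{document}
```

Note that the `falcon40b` entry has no `journal` field, so BibTeX will emit a warning with the `plain` style but still produce a usable reference.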