Tags: Text Generation · Transformers · PyTorch · English · llama · text-generation-inference · Inference Endpoints
hamishivi committed
Commit 71aba8a
1 Parent(s): 0b4d96e

Update README.md

Files changed (1): README.md (+11 -7)
README.md CHANGED
@@ -18,7 +18,10 @@ base_model: meta-llama/Llama-2-7b-hf
 This model belongs to the Tulu series of models, which is a series of language models that are trained to act as helpful assistants.
 Open Instruct ShareGPT Llama2 7B is initially fine-tuned version of Llama 2 that was trained on the [ShareGPT dataset](https://huggingface.co/datasets/anon8231489123/ShareGPT_Vicuna_unfiltered).
 The model was then further trained on the UltraFeedback dataset using [Direct Preference Optimization (DPO)](https://arxiv.org/abs/2305.18290).
-Please check out our paper [TODO] for more!
+
+For more details, read the paper: [Camels in a Changing Climate: Enhancing LM Adaptation with Tulu 2
+](https://arxiv.org/abs/2311.10702).
+
 
 
 ## Model description
@@ -106,12 +109,13 @@ The following hyperparameters were used during DPO training:
 If you find this model is useful in your work, please cite it with:
 
 ```
-@misc{ivison2023changing,
-title={Camels in a Changing Climate: Enhancing LM Adaptation with Tulu 2},
-author={Hamish Ivison and Yizhong Wang and Valentina Pyatkin and Nathan Lambert and Matthew Peters and Pradeep Dasigi and Joel Jang and David Wadden and Noah A. Smith and Iz Beltagy and Hannaneh Hajishirzi},
-year={2023},
-archivePrefix={arXiv},
-primaryClass={cs.CL}
+@misc{ivison2023camels,
+title={Camels in a Changing Climate: Enhancing LM Adaptation with Tulu 2},
+author={Hamish Ivison and Yizhong Wang and Valentina Pyatkin and Nathan Lambert and Matthew Peters and Pradeep Dasigi and Joel Jang and David Wadden and Noah A. Smith and Iz Beltagy and Hannaneh Hajishirzi},
+year={2023},
+eprint={2311.10702},
+archivePrefix={arXiv},
+primaryClass={cs.CL}
 }
 ```
 
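
The README content above names Direct Preference Optimization but does not spell it out. As a point of reference, here is a minimal sketch of the DPO objective in PyTorch. It is illustrative only and is not the training code behind this commit: it assumes you already have summed per-response log-probabilities for the chosen and rejected completions under the trained policy and the frozen reference model, and `dpo_loss` and `beta=0.1` are placeholder names/values, not settings taken from this repository.

```python
# Illustrative sketch of the DPO loss; NOT the actual training code for this model.
import torch
import torch.nn.functional as F

def dpo_loss(policy_chosen_logps: torch.Tensor,
             policy_rejected_logps: torch.Tensor,
             ref_chosen_logps: torch.Tensor,
             ref_rejected_logps: torch.Tensor,
             beta: float = 0.1) -> torch.Tensor:
    """DPO: -log sigmoid(beta * [(log pi(y_w|x) - log pi_ref(y_w|x))
                                 - (log pi(y_l|x) - log pi_ref(y_l|x))])."""
    # Implicit "rewards" are the log-ratios of policy to reference model.
    chosen_rewards = policy_chosen_logps - ref_chosen_logps
    rejected_rewards = policy_rejected_logps - ref_rejected_logps
    # Prefer the chosen response over the rejected one by a margin scaled by beta.
    logits = beta * (chosen_rewards - rejected_rewards)
    return -F.logsigmoid(logits).mean()
```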
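
Since the card describes a Transformers/PyTorch text-generation model, a loading-and-generation sketch may also help readers of the updated README. This is a minimal sketch under stated assumptions: the repository id below is a placeholder (substitute the actual model id for this card), and the chat prompt format is assumed from the Tulu model family rather than taken from the commit.

```python
# Minimal usage sketch (not from the commit itself): load the checkpoint with
# Hugging Face Transformers and generate a reply.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repo id (assumption) -- replace with the actual model id for this card.
model_id = "allenai/open-instruct-sharegpt-llama2-7b"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)

# Tulu-style chat prompt format (assumed from the Tulu model family).
prompt = "<|user|>\nWhat is Direct Preference Optimization in one sentence?\n<|assistant|>\n"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=128, do_sample=False)

# Print only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(output_ids[0][inputs["input_ids"].shape[1]:],
                       skip_special_tokens=True))
```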