---
language: en
license: other
commercial: no
inference: false
---
# Seahorse-350m
## Model description
This is the first generation of an OPT-based model, fine-tuned on the Orca dataset formatted in the Alpaca style.

## Training data
- psmathur/alpaca_orca

### How to use
You can use this model directly with a pipeline for text generation. This example generates a different sequence each time it is run:
```py
>>> from transformers import pipeline
>>> generator = pipeline('text-generation', model='TitleOS/Seahorse-350m')
>>> generator("Tell me about Alpacas.", do_sample=True, min_length=50)
```

## Limitations and biases
Based on known problems with NLP technology, potentially relevant factors include bias around gender, profession, race, and religion, which the model may reproduce from its training data.

### License
OPT-350M is licensed under the OPT-175B license, Copyright (c) Meta Platforms, Inc. All Rights Reserved.

### BibTeX entry and citation info
```
@misc{zhang2022opt,
      title={OPT: Open Pre-trained Transformer Language Models},
      author={Susan Zhang and Stephen Roller and Naman Goyal and Mikel Artetxe and Moya Chen and Shuohui Chen and Christopher Dewan and Mona Diab and Xian Li and Xi Victoria Lin and Todor Mihaylov and Myle Ott and Sam Shleifer and Kurt Shuster and Daniel Simig and Punit Singh Koura and Anjali Sridhar and Tianlu Wang and Luke Zettlemoyer},
      year={2022},
      eprint={2205.01068},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```