---
datasets:
- tiiuae/falcon-refinedweb
language:
- en
---
# Falcon-7B-Instruct

**Falcon-7B-Instruct is a 7B-parameter causal decoder-only model built by [TII](https://www.tii.ae), based on [Falcon-7B](https://huggingface.co/tiiuae/falcon-7b) and finetuned on an ensemble of instruct datasets. It is made available under the [TII Falcon LLM License](https://huggingface.co/tiiuae/falcon-7b/blob/main/LICENSE.txt).**

More details coming soon.
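As a stopgap until fuller documentation lands, the sketch below shows one plausible way to query the model with the Hugging Face `transformers` text-generation pipeline. The sampling parameters, the CUDA device, and the `trust_remote_code`/`bfloat16` settings are illustrative assumptions, not requirements stated by this card.

```python
# Hypothetical usage sketch (assumptions: a CUDA GPU with bfloat16 support,
# and the `torch` and `transformers` packages installed).

# Illustrative sampling settings; tune these for your use case.
GENERATION_KWARGS = {
    "max_length": 200,
    "do_sample": True,
    "top_k": 10,
    "num_return_sequences": 1,
}


def generate(prompt, model_id="tiiuae/falcon-7b-instruct"):
    """Load the model and return generated continuations of `prompt`."""
    import torch
    import transformers
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        trust_remote_code=True,       # Falcon ships custom modeling code
        torch_dtype=torch.bfloat16,   # halves memory vs. float32
    )
    pipeline = transformers.pipeline(
        "text-generation",
        model=model,
        tokenizer=tokenizer,
        device=0,  # first CUDA device
    )
    sequences = pipeline(
        prompt,
        eos_token_id=tokenizer.eos_token_id,
        **GENERATION_KWARGS,
    )
    return [seq["generated_text"] for seq in sequences]
```

Loading is deferred inside `generate` so importing the snippet stays cheap; in a long-running service you would build the pipeline once and reuse it across calls.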
# Model Card for Falcon-7B-Instruct
## Model Details

### Model Description

- **Developed by:** [https://www.tii.ae](https://www.tii.ae)
- **Model type:** Causal decoder-only
- **Language(s) (NLP):** English
- **License:** TII Falcon LLM License

### Model Source

- **Paper:** coming soon
- **Demo:** coming soon
## Uses

### Out-of-Scope Use

Production use without adequate assessment of risks and mitigation; any use case which may be considered irresponsible or harmful.

## Bias, Risks, and Limitations

Falcon-7B-Instruct is trained on English data only, and will not generalize appropriately to other languages. Furthermore, as it is trained on a large-scale corpus representative of the web, it will carry the stereotypes and biases commonly encountered online.

## Paper

More details coming soon in the paper.