Commit fe01255
Parent(s): eaa3900
Update README.md (#2)
- Update README.md (4a08d86c60a00e24a2dde7c9b3ebc68a1d177d8a)
Co-authored-by: doyoung kim <[email protected]>
README.md CHANGED
@@ -13,16 +13,17 @@ Our overall explanation models along with ablations can be found in our [paper](
 |-|-|
 |[Flipped_11B](https://huggingface.co/seonghyeonye/flipped_11B)|11 billion|
 |[Flipped_3B](https://huggingface.co/seonghyeonye/flipped_3B)|3 billion|
-Here is how to
+Here is how to download the model in PyTorch:
+
 ```python
-
-
-
-
-
-print(tokenizer.decode(outputs[0]))
+import torch
+from transformers import T5Tokenizer, T5ForConditionalGeneration
+
+model = T5ForConditionalGeneration.from_pretrained("seonghyeonye/flipped_11B")
+tokenizer = T5Tokenizer.from_pretrained("seonghyeonye/flipped_11B")
 ```
-If you want to use another checkpoint, please replace the path in `
+If you want to use another checkpoint, please replace the path in `T5Tokenizer` and `T5ForConditionalGeneration`.
+We also provide a quick [Jupyter Notebook](https://github.com/seonghyeonye/Flipped-Learning/blob/master/flipped_inference.ipynb) where you can inference with our method.
 **Note: the model was trained with fp32 activations. As such, we highly discourage running inference with fp16.**
 
 # Training procedure
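The added snippet only covers loading the checkpoint. Below is a minimal end-to-end sketch of how the pieces fit together, assuming the 3B checkpoint is swapped in as the note above suggests; the prompt string and generation settings are illustrative placeholders, not taken from the README, and the repository's linked notebook shows the intended inference format.

```python
import torch
from transformers import T5Tokenizer, T5ForConditionalGeneration

# Swap in the smaller checkpoint by replacing the path, as the README note suggests.
tokenizer = T5Tokenizer.from_pretrained("seonghyeonye/flipped_3B")
model = T5ForConditionalGeneration.from_pretrained("seonghyeonye/flipped_3B")  # weights load in fp32 by default

# Placeholder input: the actual prompt format for Flipped Learning is shown in the linked notebook.
inputs = tokenizer("example input text", return_tensors="pt")
with torch.no_grad():
    outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Keeping the weights in their default fp32 precision here follows the README's warning against running inference in fp16.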