whaleloops committed
Commit 271a825 • Parent: 3feaf03
Update README.md

README.md CHANGED
@@ -18,7 +18,10 @@ The learning rate was 5e-5, weight decay was 0.01, adam epsilon was 1e-5.
 
 ### Usage
 
-
+Try the following sentence with the Fill-Mask task on the right. The sentence masks the token "cardiac".
+74F with HTN, HLD, DM2, newly diagnosed atrial fibrillation in October who was transferred to hospital for <mask> catheterization after presentation there with syncopal episode.
+
+Or load the model directly from Transformers:
 ```
 from transformers import AutoTokenizer, AutoModelForMaskedLM
 tokenizer = AutoTokenizer.from_pretrained("whaleloops/KEPTlongformer-PMM3")
 ```
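The added README text shows only how to load the tokenizer. A minimal sketch of running the masked example end to end with the `fill-mask` pipeline might look like the following (it assumes the `whaleloops/KEPTlongformer-PMM3` checkpoint is reachable on the Hub; the `top_k` value is illustrative):

```python
# Sketch: run the README's Fill-Mask example locally.
# Assumes network access to download "whaleloops/KEPTlongformer-PMM3".
from transformers import AutoTokenizer, AutoModelForMaskedLM, pipeline

model_id = "whaleloops/KEPTlongformer-PMM3"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# The pipeline fills the <mask> token and returns the top candidates.
fill = pipeline("fill-mask", model=model, tokenizer=tokenizer)

text = (
    "74F with HTN, HLD, DM2, newly diagnosed atrial fibrillation in October "
    "who was transferred to hospital for <mask> catheterization after "
    "presentation there with syncopal episode."
)

# Each prediction is a dict with "token_str", "score", and the filled "sequence".
preds = fill(text, top_k=5)
for pred in preds:
    print(pred["token_str"], round(pred["score"], 4))
```

If the model was fine-tuned as described (masked clinical notes), "cardiac" should rank among the top candidates, though the exact scores depend on the checkpoint.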