Update README.md
README.md
CHANGED
@@ -42,18 +42,8 @@ inference:
 
 
 You can use this model directly with a pipeline for masked language modeling:
-```python
-from transformers import pipeline
-# 1. load the model with the huggingface `pipeline`
-genius = pipeline("text2text-generation", model='beyond/genius-large', device=0)
-# 2. provide a sketch (joint by mask tokens)
-sketch = "your_sketch"
-# 3. here we go!
-generated_text = genius(sketch, num_beams=3, do_sample=True, max_length=200)[0]['generated_text']
-print(generated_text)
-```
 
-```
+```python
 from transformers import pipeline
 # 1. load the model with the huggingface `pipeline`
 genius = pipeline("text2text-generation", model='beyond/genius-large', device=0)