# Model summary
flan-T5-large fine-tuned on the Alpaca instruction dataset with LoRA.
# Training
* torch==2.0.0+cu117
* transformers==4.28.0.dev0
* 8 × V100 (32 GB) GPUs
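
For reference, a minimal sketch of what such a LoRA setup might look like with `peft`. The rank, alpha, dropout, and target modules below are illustrative assumptions, not the values actually used to train this checkpoint:

```python
import transformers
from peft import LoraConfig, get_peft_model

base_model = transformers.AutoModelForSeq2SeqLM.from_pretrained("google/flan-t5-large")

# Hypothetical LoRA hyperparameters; r, lora_alpha, dropout, and target
# modules are assumptions, not the settings used for this checkpoint.
lora_config = LoraConfig(
    task_type="SEQ_2_SEQ_LM",   # seq2seq objective for T5
    r=8,                         # low-rank dimension (assumed)
    lora_alpha=32,               # scaling factor (assumed)
    lora_dropout=0.1,
    target_modules=["q", "v"],   # T5 attention query/value projections
)

# Wrap the base model so only the LoRA adapter weights are trainable
model = get_peft_model(base_model, lora_config)
model.print_trainable_parameters()
```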
# How to use
```python
import transformers
from peft import PeftModel

# Load the base model and its tokenizer
tokenizer = transformers.AutoTokenizer.from_pretrained("google/flan-t5-large")
base_model = transformers.AutoModelForSeq2SeqLM.from_pretrained("google/flan-t5-large")

# Attach the LoRA adapter weights to the base model
peft_model = PeftModel.from_pretrained(base_model, "zirui3/flan-t5-large-alpaca")

inputs = tokenizer("Any instruction that you like.", return_tensors="pt")
outputs = peft_model.generate(**inputs, max_length=128, do_sample=True)
print(tokenizer.batch_decode(outputs, skip_special_tokens=True))
```
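
If you prefer to run inference without the `peft` wrapper, the adapter can be folded into the base weights with `peft_model.merge_and_unload()` (a standard `peft` utility, not something documented for this checkpoint specifically), which returns a plain transformers model.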