|
# Pangu-Alpha 2.6B |
## Usage |
The Pangu model is not currently supported natively by `transformers`, so `trust_remote_code=True` is required to load the custom model code.

```python
from transformers import TextGenerationPipeline, AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("imone/pangu_2.6B", trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained("imone/pangu_2.6B", trust_remote_code=True)

text_generator = TextGenerationPipeline(model, tokenizer)
# "What are the capitals of China, the USA, Japan, France, Canada, and Australia?"
text_generator("中国和美国和日本和法国和加拿大和澳大利亚的首都分别是哪里?")
```
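
Decoding can also be controlled by calling `generate()` on the model directly. A minimal sketch, assuming the custom Pangu implementation supports the standard `transformers` generation API; the sampling parameters (`max_new_tokens`, `top_k`, `top_p`) are illustrative choices, not project defaults:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# trust_remote_code=True pulls in the custom Pangu model code from the repo.
tokenizer = AutoTokenizer.from_pretrained("imone/pangu_2.6B", trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained("imone/pangu_2.6B", trust_remote_code=True)

# Same prompt as above: "What are the capitals of China, the USA, Japan,
# France, Canada, and Australia?"
inputs = tokenizer("中国和美国和日本和法国和加拿大和澳大利亚的首都分别是哪里?", return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=64,  # cap on the number of newly generated tokens
    do_sample=True,     # sample instead of greedy decoding
    top_k=50,
    top_p=0.95,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```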