ERNIE-3.0-base-zh

Introduction

ERNIE 3.0: Large-scale Knowledge Enhanced Pre-training for Language Understanding and Generation. More details: https://arxiv.org/abs/2107.02137

Released Model Info

This PyTorch model was converted from the officially released PaddlePaddle ERNIE model, and a series of experiments was conducted to verify the accuracy of the conversion.

How to use

You can load the ERNIE-3.0 model with Hugging Face Transformers:

from transformers import BertTokenizer, ErnieForMaskedLM

tokenizer = BertTokenizer.from_pretrained("nghuyong/ernie-3.0-base-zh")
model = ErnieForMaskedLM.from_pretrained("nghuyong/ernie-3.0-base-zh")
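As a quick sanity check of the converted weights, the model can be run on a masked-language-modeling prediction. The example sentence, mask handling, and expected output below are illustrative assumptions and not part of the official card:

import torch

text = "中国的首都是[MASK]京。"  # illustrative sentence (assumption)
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Locate the [MASK] position and take the highest-scoring token there.
mask_index = (inputs.input_ids == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
predicted_id = logits[0, mask_index].argmax(dim=-1)
print(tokenizer.decode(predicted_id))  # expected to print "北" if the conversion is correct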

Citation

@article{sun2021ernie,
  title={Ernie 3.0: Large-scale knowledge enhanced pre-training for language understanding and generation},
  author={Sun, Yu and Wang, Shuohuan and Feng, Shikun and Ding, Siyu and Pang, Chao and Shang, Junyuan and Liu, Jiaxiang and Chen, Xuyi and Zhao, Yanbin and Lu, Yuxiang and others},
  journal={arXiv preprint arXiv:2107.02137},
  year={2021}
}