---
language:
- zh
tags:
- bert
license: "apache-2.0"
---

# Please use BERT-related functions to load this model!
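
For example, here is a minimal loading sketch with the `BertTokenizer` and `BertModel` classes from Hugging Face Transformers. The repository name `hfl/minirbt-h256` below is an assumption for illustration; substitute the actual name of this model.

```python
# Minimal loading sketch using the BERT classes from Transformers.
# NOTE: "hfl/minirbt-h256" is an assumed repository name; replace it with
# the actual name of this model on the Hugging Face Hub.
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("hfl/minirbt-h256")
model = BertModel.from_pretrained("hfl/minirbt-h256")

inputs = tokenizer("欢迎使用MiniRBT！", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch_size, seq_len, hidden_size)
```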

## Chinese small pre-trained model MiniRBT

To further promote research and development in Chinese information processing, we release MiniRBT, a small Chinese pre-trained model built with our self-developed knowledge distillation toolkit TextBrewer, combining Whole Word Masking with knowledge distillation.
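
As a rough illustration of the distillation idea, the sketch below shows a generic soft-label distillation loss, in which a small student model is trained to match a large teacher's temperature-softened output distribution. It is a simplified stand-in, not the actual TextBrewer configuration used to train MiniRBT.

```python
# Generic soft-label knowledge distillation loss (illustrative only;
# not the exact TextBrewer setup used for MiniRBT).
import torch.nn.functional as F
from torch import Tensor

def distillation_loss(student_logits: Tensor, teacher_logits: Tensor,
                      temperature: float = 4.0) -> Tensor:
    """KL divergence between temperature-softened teacher and student outputs."""
    t = temperature
    student_log_probs = F.log_softmax(student_logits / t, dim=-1)
    teacher_probs = F.softmax(teacher_logits / t, dim=-1)
    # Scale by t^2 so gradient magnitudes stay comparable across temperatures.
    return F.kl_div(student_log_probs, teacher_probs, reduction="batchmean") * (t * t)
```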

This repository is developed based on: https://github.com/iflytek/MiniRBT

You may also be interested in:

- Chinese LERT: https://github.com/ymcui/LERT
- Chinese PERT: https://github.com/ymcui/PERT
- Chinese MacBERT: https://github.com/ymcui/MacBERT
- Chinese ELECTRA: https://github.com/ymcui/Chinese-ELECTRA
- Chinese XLNet: https://github.com/ymcui/Chinese-XLNet
- Knowledge Distillation Toolkit - TextBrewer: https://github.com/airaria/TextBrewer

More resources by HFL: https://github.com/iflytek/HFL-Anthology