---
language:
- en
widget:
- text: "My name is Clara and I live in Berkeley, California."
datasets:
- conll2003
- wnut_17
- jnlpba
- conll2012
- BTC
tags:
- PyTorch
---

# BERT base uncased model pre-trained on 5 NER datasets

The model was trained by [SberIDP](https://github.com).

* Task: `NER`
* Training data (5 datasets): CoNLL-2003, WNUT17, JNLPBA, CoNLL-2012 (OntoNotes), BTC

The model is described [in this article](https://habr.com/ru/company/sberbank/blog/).

It is pretrained for the NER task using [Reptile](https://openai.com/blog/reptile/) and can be fine-tuned for new entity types with only a small number of samples.
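
For intuition, here is a minimal sketch of the Reptile meta-update on a toy 1-D problem. Everything in it (the task family, the quadratic loss, all function names) is illustrative only and not taken from this model's actual training setup; the point is just the update rule: run a few inner SGD steps on a sampled task, then move the meta-weights a fraction of the way toward the adapted weights.

```python
import random

def inner_sgd(w, c, steps=5, lr=0.1):
    """A few plain SGD steps on one toy task: minimize (w - c)^2.
    The gradient of the loss is 2 * (w - c)."""
    for _ in range(steps):
        w -= lr * 2.0 * (w - c)
    return w

def reptile(tasks, meta_iters=1000, meta_lr=0.5, seed=0):
    """Reptile meta-training loop (toy version).
    Each iteration: sample a task, adapt to it with inner SGD,
    then nudge the meta-initialization toward the adapted weights."""
    rng = random.Random(seed)
    w = 0.0  # meta-initialization being learned
    for _ in range(meta_iters):
        c = rng.choice(tasks)        # sample a task from the family
        w_task = inner_sgd(w, c)     # adapt to that task
        w += meta_lr * (w_task - w)  # Reptile meta-update
    return w

# With targets drawn from a family centred on 3.0, the learned
# initialization settles inside the task distribution, so adapting
# to any one task afterwards takes only a few gradient steps.
w0 = reptile(tasks=[2.0, 3.0, 4.0])
```

The same idea at full scale is what makes the model quick to fine-tune: the pre-trained weights sit close to good solutions for a whole family of NER tasks, so a handful of labeled samples suffices for a new entity type.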