---
language:
- en
datasets:
- conll2003
- wnut_17
- jnlpba
- conll2012
- BTC
tags:
- PyTorch
---

# BERT base uncased model pre-trained on 5 NER datasets

The model was trained by [SberIDP](https://github.com).
* Task: `NER`
* Training data: 5 datasets:
  - CoNLL-2003
  - WNUT17
  - JNLPBA
  - CoNLL-2012 (OntoNotes)
  - BTC


The model is described [in this article](https://habr.com/ru/company/sberbank/blog/).
It is pre-trained for the NER task using [Reptile](https://openai.com/blog/reptile/) and can be fine-tuned for new entity types with only a small number of samples.
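Below is a minimal inference sketch using the 🤗 Transformers `pipeline` API. The repository id is a placeholder and should be replaced with this model's actual Hub id; the example only illustrates how a token-classification checkpoint like this one is typically loaded and queried.

```python
from transformers import AutoTokenizer, AutoModelForTokenClassification, pipeline

# Placeholder: replace with the actual Hub repository id of this model.
model_id = "<this-model-repo-id>"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(model_id)

# Token-classification pipeline; aggregation merges sub-word pieces into whole entity spans.
ner = pipeline("ner", model=model, tokenizer=tokenizer, aggregation_strategy="simple")

print(ner("George Washington visited the Mayo Clinic in Rochester."))
```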