|
---
tags:
- token-classification
datasets:
- djagatiya/ner-ontonotes-v5-eng-v4
widget:
- text: "On September 1st George won 1 dollar while watching Game of Thrones."
---
|
|
|
# (NER) ALBERT-base-v2 : conll2012_ontonotesv5-english-v4 |
|
|
|
This `ALBERT-base-v2` NER model was fine-tuned on the `english-v4` configuration of the `conll2012_ontonotesv5` dataset. <br>
|
Check out [NER-System Repository](https://github.com/djagatiya/NER-System) for more information. |
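
## Usage

A minimal usage sketch with the 🤗 `transformers` token-classification pipeline is shown below. The model id used here is a placeholder assumption; replace it with the actual Hub repository id of this checkpoint.

```python
# Minimal sketch: run the fine-tuned NER model with the transformers pipeline.
# The model id below is assumed/placeholder -- substitute the real repository id.
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="djagatiya/ner-albert-base-v2-ontonotesv5-englishv4",  # assumed id
    aggregation_strategy="simple",  # merge sub-word pieces into entity spans
)

text = "On September 1st George won 1 dollar while watching Game of Thrones."
for entity in ner(text):
    print(entity["entity_group"], entity["word"], round(entity["score"], 3))
```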
|
|
|
## Evaluation |
|
- Precision: 86.20 |
|
- Recall: 86.18 |
|
- F1-Score: 86.19 |
|
|
|
> Check out the [eval.log](eval.log) file for the full evaluation metrics and classification report.
|
|
|
```
              precision    recall  f1-score   support

    CARDINAL       0.84      0.83      0.83       935
        DATE       0.84      0.87      0.86      1602
       EVENT       0.61      0.52      0.56        63
         FAC       0.54      0.59      0.56       135
         GPE       0.95      0.94      0.95      2240
    LANGUAGE       0.85      0.50      0.63        22
         LAW       0.56      0.57      0.57        40
         LOC       0.61      0.65      0.63       179
       MONEY       0.85      0.88      0.86       314
        NORP       0.88      0.92      0.90       841
     ORDINAL       0.78      0.86      0.81       195
         ORG       0.84      0.81      0.82      1795
     PERCENT       0.88      0.87      0.88       349
      PERSON       0.94      0.92      0.93      1988
     PRODUCT       0.57      0.53      0.55        76
    QUANTITY       0.77      0.81      0.79       105
        TIME       0.59      0.66      0.62       212
 WORK_OF_ART       0.60      0.52      0.56       166

   micro avg       0.86      0.86      0.86     11257
   macro avg       0.75      0.74      0.74     11257
weighted avg       0.86      0.86      0.86     11257
```
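
The table above follows the layout of a span-level (entity-level) classification report such as the one produced by `seqeval`. As a rough sketch of how a report in this format can be generated from BIO-tagged sequences (the tag sequences below are illustrative only, not taken from the actual evaluation):

```python
# Minimal sketch: entity-level evaluation with seqeval.
# The real report is computed from the model's predictions on the
# conll2012_ontonotesv5 english-v4 test split; these tags are examples only.
from seqeval.metrics import classification_report, f1_score

y_true = [["B-PERSON", "I-PERSON", "O", "B-DATE", "I-DATE"]]
y_pred = [["B-PERSON", "I-PERSON", "O", "B-DATE", "O"]]

print(classification_report(y_true, y_pred, digits=2))
print("F1:", f1_score(y_true, y_pred))
```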