Based on the paper: "UmlsBERT: Augmenting Contextual Embeddings with a Clinical Metathesaurus" (https://aclanthology.org/2021.naacl-main.139.pdf)

and the github repo: https://github.com/gmichalo/UmlsBERT

BERT base model.

Trained from scratch on the MIMIC dataset, using the UMLS Metathesaurus to mask words within the text.
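The checkpoint can be used like any other BERT-style masked language model. Below is a minimal sketch with the Hugging Face `transformers` library; the model identifier is a placeholder (not the actual Hub repository name), so substitute the real Hub ID or a local path to the weights.

```python
# Minimal usage sketch (assumption: "path/to/umlsbert" is a placeholder model ID)
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

model_id = "path/to/umlsbert"  # placeholder; replace with the actual checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# Predict a masked clinical term
text = f"The patient was diagnosed with {tokenizer.mask_token} failure."
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Top-5 candidate tokens for the masked position
mask_pos = (inputs.input_ids == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
top_ids = logits[0, mask_pos].topk(5).indices[0]
print(tokenizer.convert_ids_to_tokens(top_ids))
```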

We achieved better accuracy on the MedNLI dataset.

BERT model accuracy: 83%
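MedNLI is a 3-way natural language inference task (entailment / contradiction / neutral) over clinical sentence pairs. A hedged sketch of how such an evaluation could be run is shown below; the fine-tuned checkpoint ID and the example sentence pair are placeholders, and MedNLI itself requires credentialed access via PhysioNet.

```python
# Sketch of MedNLI-style sentence-pair classification (assumption: the model ID
# below is a placeholder for a checkpoint fine-tuned on MedNLI with 3 labels)
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "path/to/umlsbert-mednli"  # placeholder
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id, num_labels=3)

premise = "The patient is on metformin."      # illustrative example, not from MedNLI
hypothesis = "The patient has diabetes."
inputs = tokenizer(premise, hypothesis, return_tensors="pt", truncation=True)
with torch.no_grad():
    pred = model(**inputs).logits.argmax(dim=-1).item()
print(pred)  # index into the label set defined at fine-tuning time
```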