---
license: apache-2.0
language:
- en
library_name: sentence-transformers
tags:
- entity
- entity coreference
- Wikipedia
- newspaper
- news
---
|
|
|
This model was contrastively trained for entity coreference on a dataset constructed from mentions of the same entity. It requires input text in which entities have already been detected via NER, and it focuses specifically on Person [PER] tags.
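
As a minimal usage sketch (assuming the model loads through the standard `SentenceTransformer` API; the model ID below is a placeholder, and passing each mention in its full sentence context is an assumption), [PER] mentions can be encoded like this:

```python
# Minimal sketch: encode NER-detected [PER] mentions in context.
# The model ID is a placeholder and the input convention (the full sentence
# containing the mention) is an assumption; see the linked repo for details.
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("path/to/this-model")  # placeholder model ID

# Sentence contexts containing NER-detected [PER] mentions.
mentions = [
    "President Roosevelt addressed the nation on Tuesday.",
    "Mr. Roosevelt signed the bill into law yesterday.",
    "Eleanor Roosevelt spoke at the conference in Chicago.",
]

embeddings = model.encode(mentions)
print(embeddings.shape)  # e.g. (3, 768) for an MPNet-based bi-encoder
```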
|
|
|
We start with a base S-BERT MPNet bi-encoder model (18). This is contrastively trained on 179 million pairs drawn from mentions of entities on Wikipedia, where positives are mentions of the same individual. Hard negatives are mined from individuals that appear on the same disambiguation pages. Embeddings from the tuned coreference-resolution model are then clustered using hierarchical agglomerative clustering.
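
The clustering step can be sketched with scikit-learn's `AgglomerativeClustering`; the cosine metric, average linkage, and distance threshold below are illustrative assumptions, not the settings used in the paper:

```python
# Sketch of the clustering step: group mention embeddings so that mentions
# of the same person share a cluster label. The threshold, metric, and
# linkage are assumed values for illustration; see the paper and repo
# for the actual settings.
import numpy as np
from sklearn.cluster import AgglomerativeClustering

rng = np.random.default_rng(0)
embeddings = rng.normal(size=(10, 768)).astype("float32")  # stand-in for model.encode(...)

clusterer = AgglomerativeClustering(
    n_clusters=None,          # let the distance threshold determine the cluster count
    distance_threshold=0.3,   # assumed value; tune on held-out data
    metric="cosine",
    linkage="average",        # "ward" would require euclidean distances
)
labels = clusterer.fit_predict(embeddings)
print(labels)  # mentions with the same label are resolved to the same person
```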
|
|
|
More information about its training and use can be found in the associated code [repo](https://github.com/dell-research-harvard/newswire/tree/main) and [paper](https://arxiv.org/pdf/2406.09490).
|