crisistransformers/CT-M1-BestLoss
Fill-Mask · Crisis informatics
CrisisTransformers is a family of pre-trained language models and sentence encoders introduced in the papers "CrisisTransformers: Pre-trained language models and sentence encoders for crisis-related social media texts" and "Semantically Enriched Cross-Lingual Sentence Embeddings for Crisis-related Social Media Texts".
The models were trained on a corpus of over 15 billion word tokens from tweets associated with 30+ crisis events, such as disease outbreaks, natural disasters, and conflicts.
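Since the model is tagged for fill-mask, it can be queried through the standard Hugging Face `transformers` fill-mask pipeline. The sketch below is a minimal, hedged example: the `predict_masked` helper and its `[MASK]` placeholder convention are illustrative names, not part of the CrisisTransformers release, and the model weights are downloaded from the Hub on first use.

```python
from transformers import pipeline


def predict_masked(text: str, model_id: str = "crisistransformers/CT-M1-BestLoss"):
    """Return fill-mask predictions for `text` using a CrisisTransformers model.

    A minimal sketch assuming the standard `transformers` fill-mask pipeline;
    `predict_masked` is a hypothetical helper, not an official API.
    """
    fill_mask = pipeline("fill-mask", model=model_id)
    # Substitute the tokenizer's own mask token so the sketch works whether
    # the checkpoint expects "<mask>" or "[MASK]".
    masked = text.replace("[MASK]", fill_mask.tokenizer.mask_token)
    return fill_mask(masked)


if __name__ == "__main__":
    # Guarded so importing this module does not trigger a model download.
    for pred in predict_masked("A powerful earthquake struck the [MASK] this morning."):
        print(pred["token_str"], pred["score"])
```

Each prediction is a dict containing, among other fields, the filled-in `token_str` and its `score`, so the top candidates for the masked slot can be inspected directly.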