---
license: mit
language:
- en
- de
- fr
- fi
- sv
- nl
---
# hmByT5 - Preliminary Language Models
Preliminary Historic Multilingual and Monolingual ByT5 Models. The following languages are currently covered:
* English (British Library Corpus - Books)
* German (Europeana Newspaper)
* French (Europeana Newspaper)
* Finnish (Europeana Newspaper)
* Swedish (Europeana Newspaper)
* Dutch (Delpher Corpus)
More details can be found in [our GitHub repository](https://github.com/stefan-it/hmByT5).
In this experiment we sample 4B bytes (~4GB of text) from each corpus (and upsample Swedish and Finnish), as sketched below.
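The exact preprocessing pipeline lives in the GitHub repository; purely as an illustration of the idea, a byte-budget sampler with upsampling could look like the following minimal sketch (the `UPSAMPLE` factors and file handling are hypothetical, not the actual code):

```python
# Minimal sketch of byte-budget sampling (hypothetical, not the actual
# hmByT5 preprocessing code). Lines are collected until ~4B bytes per
# language are reached; smaller corpora are cycled, i.e. upsampled.
BYTE_BUDGET = 4_000_000_000  # ~4GB of text per language

# Hypothetical upsampling passes for the smaller corpora
UPSAMPLE = {"sv": 2, "fi": 2}

def sample_corpus(path, lang, budget=BYTE_BUDGET):
    """Yield lines from `path` until the byte budget is spent."""
    for _ in range(UPSAMPLE.get(lang, 1)):
        with open(path, encoding="utf-8") as f:
            for line in f:
                size = len(line.encode("utf-8"))
                if size > budget:
                    return
                budget -= size
                yield line
```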
# Pretraining
We use the official JAX/FLAX example in Hugging Face Transformers to pretrain a ByT5 model on a single v3-8 TPU.
Details about the training can be found [here](https://github.com/stefan-it/hmByT5/tree/main/hmbyt5-flax).
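Once pretrained, the checkpoint can be loaded like any other ByT5 model via Transformers. A minimal usage sketch (the example sentence is arbitrary; depending on which weight format was pushed, `from_flax=True` may be needed):

```python
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_id = "hmbyt5-preliminary/byt5-small-multilingual-4g"

# ByT5 operates directly on UTF-8 bytes, so historic spelling variants
# and OCR noise never fall out of vocabulary.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)

inputs = tokenizer("Eine alte Zeitung aus dem Jahre 1850.", return_tensors="pt")
encoder_outputs = model.encoder(**inputs)
print(encoder_outputs.last_hidden_state.shape)  # (1, seq_len, d_model)
```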
# Evaluation on Downstream Tasks (NER)
We evaluated the hmByT5 model on the following downstream NER tasks:
| Model | English AjMC | German AjMC | French AjMC | Finnish NewsEye | Swedish NewsEye | Dutch ICDAR | French ICDAR | Avg. |
|-------|--------------|-------------|-------------|-----------------|-----------------|-------------|--------------|------|
| [`hmbyt5-preliminary/byt5-small-multilingual-4g`](https://huggingface.co/hmbyt5-preliminary/byt5-small-multilingual-4g) | 83.49 ± 0.96 | 87.65 ± 0.63 | 84.16 ± 0.90 | | | | | |
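The fine-tuning code used for these results can be found in the GitHub repository linked above. Purely as an illustration of how a byte-level NER head can sit on top of the encoder, here is a hedged sketch (the label count and example sentence are hypothetical; this is not the evaluation setup behind the table):

```python
import torch
from transformers import AutoTokenizer, T5EncoderModel

model_id = "hmbyt5-preliminary/byt5-small-multilingual-4g"
num_labels = 9  # hypothetical, e.g. BIO tags for four entity types

tokenizer = AutoTokenizer.from_pretrained(model_id)
encoder = T5EncoderModel.from_pretrained(model_id)

# Simple token-classification head over the byte-level encoder states.
classifier = torch.nn.Linear(encoder.config.d_model, num_labels)

inputs = tokenizer("Paris , den 1. Januar 1870 .", return_tensors="pt")
hidden = encoder(**inputs).last_hidden_state  # (1, seq_len, d_model)
logits = classifier(hidden)                   # (1, seq_len, num_labels)
print(logits.shape)
```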
# Acknowledgements
Research supported with Cloud TPUs from Google's [TPU Research Cloud](https://sites.research.google/trc/about/) (TRC).
Many thanks for providing access to the TPUs ❤️