---
language:
  - sv
---

# Megatron-BERT-base Swedish 125k

This BERT model was trained using the Megatron-LM library. It is a standard BERT-base with 110M parameters, trained on about 70 GB of data consisting mostly of OSCAR and Swedish newspaper text curated by the National Library of Sweden.

Training ran for 125k steps. A sister model used the same setup but was instead trained for 600k steps.
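As a minimal usage sketch, the checkpoint can be loaded with the Hugging Face `transformers` library. Note that the repository id below is an assumption (this card does not state it); substitute the actual Hub id:

```python
from transformers import pipeline

# Fill-mask example for a Swedish BERT model.
# The model id is assumed, not confirmed by this card -- replace as needed.
unmasker = pipeline(
    "fill-mask",
    model="KBLab/megatron-bert-base-swedish-cased-125k",  # assumed id
)

# BERT-style models predict the [MASK] token in context.
print(unmasker("Stockholm är Sveriges [MASK]."))
```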

The model has three sister models trained on the same dataset: