Citation info
README.md CHANGED
@@ -1,5 +1,6 @@
 ---
 language: sv
+arxiv: https://arxiv.org/abs/2007.01658
 ---
 
 # Swedish BERT Models
@@ -119,3 +120,16 @@ model = AutoModel.from_pretrained('KBLab/albert-base-swedish-cased-alpha')
 - Model pretraining was made partly in-house at the KBLab and partly (for material without active copyright) with the support of Cloud TPUs from Google's TensorFlow Research Cloud (TFRC).
 - Models are hosted on S3 by Huggingface 🤗
 
+
+## Citation
+
+https://arxiv.org/abs/2007.01658
+
+@misc{malmsten2020playing,
+  title={Playing with Words at the National Library of Sweden -- Making a Swedish BERT},
+  author={Martin Malmsten and Love Börjeson and Chris Haffenden},
+  year={2020},
+  eprint={2007.01658},
+  archivePrefix={arXiv},
+  primaryClass={cs.CL}
+}
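The second hunk header shows the README's surrounding usage context: loading the model through the Hugging Face `transformers` API. A minimal sketch of that usage, assuming the `transformers` package is installed and the public `KBLab/albert-base-swedish-cased-alpha` checkpoint is available on the Hub:

```python
# Sketch: load the Swedish ALBERT model referenced in the hunk header
# and fetch contextual embeddings for a short Swedish sentence.
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained('KBLab/albert-base-swedish-cased-alpha')
model = AutoModel.from_pretrained('KBLab/albert-base-swedish-cased-alpha')

# "The Royal Library is Sweden's national library." (illustrative input)
inputs = tokenizer("Kungliga biblioteket är Sveriges nationalbibliotek.",
                   return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, sequence_length, hidden_size)
```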