---
license: apache-2.0
---

# mBERT Swedish distilled base model (cased)

This model is a distilled version of [mBERT](https://huggingface.co/bert-base-multilingual-cased). It was distilled on Swedish data, specifically the 2010-2015 portion of the [Swedish Culturomics Gigaword Corpus](https://spraakbanken.gu.se/en/resources/gigaword). The code for the distillation process is available [here](https://github.com/AddedK/swedish-mbert-distillation/blob/main/azureML/pretrain_distillation.py).