---
license: cc-by-sa-4.0
language:
- de
tags:
- text complexity
---
# Model Card for DistilBERT German Text Complexity
This model is a version of [distilbert-base-german-cased](https://huggingface.co/distilbert-base-german-cased) fine-tuned for text complexity prediction on a scale from 1 to 7.
## Direct Use
To use this model, run our [eval_distilbert.py](https://github.com/MiriUll/text_complexity/blob/master/eval_distilbert.py) script.
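Alternatively, the model can be loaded directly with the `transformers` library. The sketch below is a minimal example assuming the model exposes a single-logit regression head via `AutoModelForSequenceClassification`; the model id shown is a placeholder for this repository's id on the Hugging Face Hub.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Placeholder id: substitute this repository's actual Hub id.
model_id = "MiriUll/distilbert-german-text-complexity"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

text = "Der Hund läuft über die Straße."  # "The dog runs across the street."
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    score = model(**inputs).logits.item()  # single regression output
print(f"Predicted complexity (scale 1-7): {score:.2f}")
```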
## Training Details
The model is a fine-tuned version of [distilbert-base-german-cased](https://huggingface.co/distilbert-base-german-cased) and a contribution to the GermEval 2022 shared task on text complexity prediction.
It was fine-tuned on the dataset by [Naderi et al, 2019](https://arxiv.org/abs/1904.07733).
For further details, visit our [KONVENS paper](https://aclanthology.org/2022.germeval-1.4/).
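For illustration, the following sketch shows how such a regression fine-tune can be set up with the `transformers` `Trainer`. The two inline training examples and all hyperparameters are placeholders, not the actual training configuration; that is described in the paper, and the real training data is the Naderi et al. corpus.

```python
import torch
from transformers import (AutoTokenizer, AutoModelForSequenceClassification,
                          Trainer, TrainingArguments)

base = "distilbert-base-german-cased"
tokenizer = AutoTokenizer.from_pretrained(base)
# num_labels=1 with problem_type="regression" gives a single-output MSE head
model = AutoModelForSequenceClassification.from_pretrained(
    base, num_labels=1, problem_type="regression")

class ComplexityDataset(torch.utils.data.Dataset):
    """Pairs of German sentences and complexity scores on the 1-7 scale."""
    def __init__(self, texts, scores):
        self.enc = tokenizer(texts, truncation=True, padding=True)
        self.scores = scores
    def __len__(self):
        return len(self.scores)
    def __getitem__(self, i):
        item = {k: torch.tensor(v[i]) for k, v in self.enc.items()}
        item["labels"] = torch.tensor(self.scores[i], dtype=torch.float)
        return item

# Illustrative examples only; see Naderi et al., 2019 for the actual dataset.
train_data = ComplexityDataset(
    ["Der Hund bellt.", "Die epistemologische Fundierung bleibt umstritten."],
    [1.5, 6.0])

args = TrainingArguments(output_dir="distilbert-complexity",
                         num_train_epochs=1, per_device_train_batch_size=2)
Trainer(model=model, args=args, train_dataset=train_data).train()
```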
## Citation
If you use this model, please cite our [INLG 2023 paper](https://arxiv.org/abs/2307.13989).
**BibTeX:**
```bibtex
@inproceedings{anschutz-groh-2022-tum,
    title = "{TUM} Social Computing at {G}erm{E}val 2022: Towards the Significance of Text Statistics and Neural Embeddings in Text Complexity Prediction",
    author = {Ansch{\"u}tz, Miriam and Groh, Georg},
    booktitle = "Proceedings of the GermEval 2022 Workshop on Text Complexity Assessment of German Text",
    month = sep,
    year = "2022",
    address = "Potsdam, Germany",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2022.germeval-1.4",
    pages = "21--26",
}
```