---
language:
- ru
- ru-RU
tags:
- mbart
inference:
  parameters:
    no_repeat_ngram_size: 4
datasets:
- samsum
widget:
- text: |
    Jeff: Can I train a 🤗 Transformers model on Amazon SageMaker?
    Philipp: Sure you can use the new Hugging Face Deep Learning Container.
    Jeff: ok.
    Jeff: and how can I get started?
    Jeff: where can I find documentation?
    Philipp: ok, ok you can find everything here. https://huggingface.co/blog/the-partnership-amazon-sagemaker-and-hugging-face
model-index:
- name: mbart_ruDialogSum
  results:
  - task:
      name: Abstractive Dialogue Summarization
      type: abstractive-text-summarization
    dataset:
      name: "SAMSum Corpus (translated to Russian)"
      type: samsum
    metrics:
    - name: Validation ROUGE-1
      type: rouge-1
      value: 30
    - name: Validation ROUGE-L
      type: rouge-l
      value: 30
    - name: Test ROUGE-1
      type: rouge-1
      value: 31
    - name: Test ROUGE-L
      type: rouge-l
      value: 31
---
### 📝 Description
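This card describes an MBart checkpoint for abstractive summarization of Russian dialogues, with `no_repeat_ngram_size: 4` as the suggested generation setting. A minimal usage sketch with 🤗 Transformers is below; the Hub id is a placeholder (this card does not state the repository id), and the other generation parameters are illustrative assumptions.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Placeholder Hub id -- substitute the actual repository id of this model.
MODEL_ID = "<user>/mbart_ruDialogSum"

# no_repeat_ngram_size mirrors the card's inference parameters;
# max_length and num_beams are illustrative assumptions.
GEN_KWARGS = {"no_repeat_ngram_size": 4, "max_length": 60, "num_beams": 4}


def summarize(dialogue: str) -> str:
    """Summarize a Russian dialogue with the MBart checkpoint."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_ID)
    inputs = tokenizer(dialogue, return_tensors="pt", truncation=True)
    summary_ids = model.generate(**inputs, **GEN_KWARGS)
    return tokenizer.decode(summary_ids[0], skip_special_tokens=True)


if __name__ == "__main__":
    dialogue = (
        "Аня: Пойдём завтра в кино?\n"
        "Борис: Давай, во сколько?\n"
        "Аня: В семь вечера у входа."
    )
    print(summarize(dialogue))
```

The model and tokenizer are loaded inside `summarize` only for brevity; in a service you would load them once and reuse them across calls.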