# mt5-large_V8901_V89881
This model is a fine-tuned version of [emilstabil/mt5-large_V8901](https://huggingface.co/emilstabil/mt5-large_V8901) on an unspecified dataset. It achieves the following results on the evaluation set (a usage sketch follows the metrics):
- Loss: 1.7699
- Rouge1: 27.4218
- Rouge2: 9.9757
- Rougel: 15.034
- Rougelsum: 25.6557
- Gen Len: 542.2616
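As a usage sketch (not part of the original card), the checkpoint can be loaded through the standard `transformers` seq2seq classes. The repo id `emilstabil/mt5-large_V8901_V89881` is inferred from the title, the input text is a placeholder, and the summarization framing is an assumption based on the ROUGE metrics above:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Assumed repo id, inferred from the card title and the base model's namespace
model_id = "emilstabil/mt5-large_V8901_V89881"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

text = "..."  # placeholder input document; the task is assumed to be summarization
inputs = tokenizer(text, return_tensors="pt", truncation=True)

# max_new_tokens sized generously, since the reported average Gen Len is ~542 tokens
output_ids = model.generate(**inputs, max_new_tokens=600, num_beams=4)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```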
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (see the sketch after this list):
- learning_rate: 3e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 8
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 11
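A minimal sketch of how these values map onto `Seq2SeqTrainingArguments`, assuming the usual `Seq2SeqTrainer` workflow; the output path is a placeholder, not from the card:

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="mt5-large_V8901_V89881",  # placeholder output path
    learning_rate=3e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    gradient_accumulation_steps=4,  # effective train batch size: 2 * 4 = 8
    num_train_epochs=11,
    lr_scheduler_type="linear",
    seed=42,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08, matching the list above
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```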
### Training results
| Training Loss | Epoch | Step | Validation Loss | Rouge1  | Rouge2 | Rougel  | Rougelsum | Gen Len  |
|---------------|-------|------|-----------------|---------|--------|---------|-----------|----------|
| 3.5194        | 2.11  | 500  | 2.0126          | 26.5426 | 9.1387 | 14.0294 | 24.8851   | 547.0    |
| 2.5168        | 4.21  | 1000 | 1.8798          | 27.1959 | 9.8737 | 14.6794 | 25.5359   | 547.0    |
| 2.1275        | 6.32  | 1500 | 1.8062          | 27.3601 | 9.9172 | 14.9859 | 25.5846   | 543.2152 |
| 1.9746        | 8.42  | 2000 | 1.7772          | 27.1321 | 9.8267 | 15.0547 | 25.4467   | 542.2278 |
| 1.9195        | 10.53 | 2500 | 1.7699          | 27.4218 | 9.9757 | 15.034  | 25.6557   | 542.2616 |
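The card does not state which scorer produced these numbers; one common way to compute comparable ROUGE values is the Hugging Face `evaluate` package (the prediction and reference strings below are stand-ins):

```python
import evaluate

rouge = evaluate.load("rouge")

predictions = ["the generated summary"]  # stand-in model outputs
references = ["the reference summary"]   # stand-in gold summaries

# Returns rouge1 / rouge2 / rougeL / rougeLsum scores as in the table above
scores = rouge.compute(predictions=predictions, references=references, use_stemmer=True)
print(scores)
```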
### Framework versions
- Transformers 4.30.2
- Pytorch 1.12.1+git7548e2f
- Datasets 2.13.2
- Tokenizers 0.13.3