
DanSumT5-largeV_38143V_15157

This model is a fine-tuned version of emilstabil/DanSumT5-largeV_38143 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 1.9526
  • Rouge1: 36.0906
  • Rouge2: 12.3436
  • RougeL: 22.6262
  • RougeLsum: 33.6191
  • Gen Len: 124.7848
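
A minimal inference sketch, assuming the checkpoint is published on the Hugging Face Hub as `emilstabil/DanSumT5-largeV_38143V_15157` (the repo id is inferred from the model name and base checkpoint, so adjust it if the checkpoint lives elsewhere):

```python
# Minimal inference sketch; the repo id below is an assumption inferred from
# the model name and is not confirmed by this card.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_id = "emilstabil/DanSumT5-largeV_38143V_15157"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

text = "Artiklens fulde tekst her ..."  # placeholder input document
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=1024)
summary_ids = model.generate(**inputs, max_length=128, num_beams=4)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```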

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 3e-05
  • train_batch_size: 4
  • eval_batch_size: 4
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 16
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 11
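
As a sketch, these settings map onto `Seq2SeqTrainingArguments` in Transformers 4.30 roughly as follows; the output directory is a placeholder, and the dataset, tokenizer, and metric wiring are omitted because they are not documented here:

```python
# Sketch of the reported hyperparameters expressed as Seq2SeqTrainingArguments.
# The output_dir is a placeholder; data and metric setup are not shown because
# the card does not document them.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="DanSumT5-largeV_38143V_15157",
    learning_rate=3e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    gradient_accumulation_steps=4,  # effective train batch size: 4 * 4 = 16
    seed=42,
    num_train_epochs=11,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="epoch",
    predict_with_generate=True,  # needed for ROUGE/Gen Len during evaluation
)
```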

Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1  | Rouge2  | RougeL  | RougeLsum | Gen Len  |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|:-------:|:---------:|:--------:|
| No log        | 0.99  | 118  | 1.9535          | 36.1763 | 12.4392 | 22.5367 | 33.632    | 124.7637 |
| No log        | 2.0   | 237  | 1.9579          | 36.074  | 12.4948 | 22.6669 | 33.5774   | 124.2954 |
| No log        | 3.0   | 356  | 1.9592          | 35.8935 | 12.4362 | 22.4628 | 33.4068   | 124.3713 |
| No log        | 4.0   | 475  | 1.9579          | 35.9893 | 12.4292 | 22.4658 | 33.622    | 124.384  |
| 1.6658        | 4.99  | 593  | 1.9642          | 35.9501 | 12.279  | 22.4227 | 33.4503   | 124.3207 |
| 1.6658        | 6.0   | 712  | 1.9598          | 35.8682 | 12.3408 | 22.5165 | 33.2375   | 124.4135 |
| 1.6658        | 7.0   | 831  | 1.9609          | 35.6712 | 12.0964 | 22.2602 | 33.2817   | 124.8776 |
| 1.6658        | 8.0   | 950  | 1.9567          | 35.8782 | 12.272  | 22.6389 | 33.462    | 124.0084 |
| 1.5814        | 8.99  | 1068 | 1.9591          | 35.888  | 12.278  | 22.4367 | 33.4144   | 124.2658 |
| 1.5814        | 10.0  | 1187 | 1.9544          | 35.8605 | 12.2878 | 22.5468 | 33.4009   | 124.4388 |
| 1.5814        | 10.93 | 1298 | 1.9526          | 36.0906 | 12.3436 | 22.6262 | 33.6191   | 124.7848 |
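
For reference, ROUGE scores like those above are typically computed with the `evaluate` library; the strings below are placeholders, since the evaluation data is not documented, and `evaluate` returns fractions, so the card's values presumably correspond to scores scaled by 100:

```python
# Sketch of ROUGE scoring with the evaluate library; the example strings are
# placeholders, not the actual evaluation data.
import evaluate

rouge = evaluate.load("rouge")
predictions = ["modellen opsummerer artiklen i faa saetninger"]
references = ["artiklen opsummeres kort af modellen"]
scores = rouge.compute(predictions=predictions, references=references)
print(scores)  # keys: rouge1, rouge2, rougeL, rougeLsum (fractions in [0, 1])
```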

Framework versions

  • Transformers 4.30.2
  • Pytorch 1.12.1+git7548e2f
  • Datasets 2.13.2
  • Tokenizers 0.13.3