t5-base-finetuned-scitldr

This model is a fine-tuned version of t5-base; the auto-generated card records the training data as unknown, though the model name points at SciTLDR. It achieves the following results on the evaluation set:

  • Loss: 3.1055
  • ROUGE-1: 23.6222
  • ROUGE-2: 10.2432
  • ROUGE-L: 19.702
  • ROUGE-Lsum: 20.9458
  • Gen Len: 18.979 (mean generated length, in tokens)

Model description

The card itself provides no details. The base model, t5-base, is an encoder-decoder Transformer (~223M parameters, stored here as F32 safetensors) that casts every NLP task as text-to-text generation; this checkpoint appears to adapt it to TLDR-style summarization of scientific papers.

Intended uses & limitations

The card does not document this. Given the base model and the model name, the natural use is abstractive summarization of scientific text into short, single-sentence TLDRs. The usual T5 caveats apply: the model was pre-trained primarily on English, inputs are typically truncated to 512 tokens, and generated summaries can contain unsupported statements.
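Since usage is not documented on the card, the snippet below is a minimal sketch of how a T5 summarization checkpoint like this one is typically called through the transformers pipeline; the input abstract is a made-up placeholder, and the max_length value is chosen to match the ~19-token average generation length reported above.

```python
from transformers import pipeline

# Load the fine-tuned checkpoint from the Hub.
summarizer = pipeline("summarization", model="witchling22/t5-base-finetuned-scitldr")

# Placeholder abstract; not taken from any dataset.
abstract = (
    "We study transfer learning for text-to-text Transformers and show that a "
    "single pre-trained encoder-decoder model can be fine-tuned to produce "
    "one-sentence summaries of scientific papers."
)

# max_length of ~20 tokens mirrors the reported Gen Len of ~18.98.
summary = summarizer(abstract, max_length=20, min_length=5)[0]["summary_text"]
print(summary)
```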

Training and evaluation data

Not recorded in the card. The model name indicates SciTLDR, a corpus of scientific papers paired with single-sentence summaries (TLDRs).
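If the data is indeed SciTLDR, loading it would look roughly like the sketch below; the allenai/scitldr dataset ID and its Abstract configuration are assumptions based on the public Hub dataset, not on anything recorded in this card.

```python
from datasets import load_dataset

# Assumed dataset: the abstract-only configuration of SciTLDR on the Hub.
dataset = load_dataset("allenai/scitldr", "Abstract")

example = dataset["train"][0]
# "source" is the input text split into sentences; "target" is a list of TLDRs.
print(" ".join(example["source"]))
print(example["target"][0])
```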

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 4e-05
  • train_batch_size: 2
  • eval_batch_size: 2
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 1
  • mixed_precision_training: Native AMP
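Expressed as code, the list above corresponds roughly to the Seq2SeqTrainingArguments below. This is a reconstruction, not the author's script: the eval_steps=100 value is inferred from the step column of the results table, and the output directory name is hypothetical. Adam's betas and epsilon match the library defaults, so they are not set explicitly.

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="t5-base-finetuned-scitldr",  # hypothetical path
    learning_rate=4e-05,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=1,
    fp16=True,                    # "Native AMP" mixed precision
    evaluation_strategy="steps",  # inferred: eval every 100 steps in the table
    eval_steps=100,
    predict_with_generate=True,   # needed for ROUGE / Gen Len during eval
)
```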

Training results

| Training Loss | Epoch | Step | Validation Loss | ROUGE-1 | ROUGE-2 | ROUGE-L | ROUGE-Lsum | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|:-------:|:----------:|:-------:|
| 2.4272        | 0.1   | 100  | 3.1951          | 23.0447 | 9.7818  | 19.0676 | 20.1677    | 18.9532 |
| 2.0362        | 0.2   | 200  | 3.1715          | 23.5443 | 10.1156 | 19.5788 | 20.6995    | 18.9483 |
| 2.188         | 0.3   | 300  | 3.1067          | 24.2387 | 10.3059 | 20.0964 | 21.2592    | 18.9338 |
| 2.0312        | 0.4   | 400  | 3.1092          | 23.3168 | 10.1308 | 19.4275 | 20.611     | 18.9742 |
| 2.012         | 0.5   | 500  | 3.1189          | 23.6989 | 10.3005 | 19.7634 | 20.9462    | 18.9758 |
| 2.0581        | 0.6   | 600  | 3.1191          | 23.6818 | 10.2636 | 19.7953 | 20.9935    | 18.9774 |
| 2.0067        | 0.7   | 700  | 3.1297          | 23.8476 | 10.5139 | 19.9696 | 21.1594    | 18.9774 |
| 2.0049        | 0.8   | 800  | 3.1150          | 23.6929 | 10.3243 | 19.7895 | 21.0455    | 18.979  |
| 2.1839        | 0.9   | 900  | 3.1055          | 23.6222 | 10.2432 | 19.702  | 20.9458    | 18.979  |
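The ROUGE columns use the key names produced by the evaluate library's rouge metric, scaled by 100. A sketch of reproducing that computation, with placeholder strings standing in for real model outputs and references:

```python
import evaluate

rouge = evaluate.load("rouge")

predictions = ["a placeholder model-generated summary"]
references = ["a placeholder reference tldr"]

# compute() returns rouge1 / rouge2 / rougeL / rougeLsum F-measures in [0, 1];
# the table above reports them multiplied by 100.
scores = rouge.compute(predictions=predictions, references=references)
print({name: round(value * 100, 4) for name, value in scores.items()})
```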

Framework versions

  • Transformers 4.35.2
  • PyTorch 2.1.0+cu118
  • Datasets 2.15.0
  • Tokenizers 0.15.0