flan-t5-base-cars-descriptions

This model is a fine-tuned version of google/flan-t5-base on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 1.0993
  • Rouge1: 19.8484
  • Rouge2: 13.6841
  • Rougel: 17.8819
  • Rougelsum: 19.2489
  • Gen Len: 19.0
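The ROUGE scores above measure n-gram overlap between generated and reference texts (Rouge1: unigrams, Rouge2: bigrams, RougeL/RougeLsum: longest common subsequence). As a rough illustration of what the Rouge1 number means, here is a simplified unigram-F1 computation; note the card's figures come from the standard `rouge_score` implementation (which also applies stemming and reports scores scaled to 0–100), so this sketch is illustrative only:

```python
from collections import Counter

def rouge1_f1(prediction: str, reference: str) -> float:
    """Simplified ROUGE-1 F1: unigram overlap between prediction and reference."""
    pred_tokens = prediction.lower().split()
    ref_tokens = reference.lower().split()
    if not pred_tokens or not ref_tokens:
        return 0.0
    # Clipped overlap: each reference token can only be matched once.
    overlap = sum((Counter(pred_tokens) & Counter(ref_tokens)).values())
    if overlap == 0:
        return 0.0
    precision = overlap / len(pred_tokens)
    recall = overlap / len(ref_tokens)
    return 2 * precision * recall / (precision + recall)
```

A score of 19.85 (i.e. 0.1985 on this 0–1 scale) indicates roughly one in five unigrams is shared between generated and reference descriptions.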

Model description

More information needed

Intended uses & limitations

More information needed
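The expected input format is not documented in this card. As a minimal sketch of how a flan-t5-base finetune is typically loaded and queried with the transformers API — the prompt below is an invented example, not a documented format:

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

repo_id = "fashxp/flan-t5-base-cars-descriptions"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSeq2SeqLM.from_pretrained(repo_id)

# Hypothetical prompt -- the actual input format used during training
# is not documented in this card.
prompt = "Generate a description for: BMW 320d, 2019, 190 hp, diesel"
inputs = tokenizer(prompt, return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```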

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 5
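A sketch of how these hyperparameters map onto the standard `Seq2SeqTrainingArguments` API. The `output_dir`, evaluation, and generation settings below are assumptions, not stated in the card; the Adam betas and epsilon listed above are the Trainer defaults, so they need no explicit arguments:

```python
from transformers import Seq2SeqTrainingArguments

# Hypothetical reconstruction of the run configuration from the list above.
training_args = Seq2SeqTrainingArguments(
    output_dir="flan-t5-base-cars-descriptions",  # placeholder, not from the card
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=5,
    evaluation_strategy="epoch",  # assumed: the results table reports per-epoch eval
    predict_with_generate=True,   # assumed: required to compute ROUGE and Gen Len
)
```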

Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1  | Rouge2  | Rougel  | Rougelsum | Gen Len |
|---------------|-------|------|-----------------|---------|---------|---------|-----------|---------|
| No log        | 1.0   | 28   | 1.3129          | 19.8055 | 13.244  | 17.4816 | 18.9752   | 19.0    |
| No log        | 2.0   | 56   | 1.1922          | 20.0308 | 13.7036 | 17.7216 | 19.3773   | 19.0    |
| No log        | 3.0   | 84   | 1.1337          | 19.4591 | 13.5553 | 17.5737 | 18.7867   | 19.0    |
| No log        | 4.0   | 112  | 1.1075          | 19.8452 | 13.681  | 17.8433 | 19.246    | 19.0    |
| No log        | 5.0   | 140  | 1.0993          | 19.8484 | 13.6841 | 17.8819 | 19.2489   | 19.0    |

Framework versions

  • Transformers 4.35.0
  • Pytorch 2.1.0+cu118
  • Datasets 2.14.6
  • Tokenizers 0.14.1

Model weights

  • Format: Safetensors
  • Model size: 248M params
  • Tensor type: F32

Model tree for fashxp/flan-t5-base-cars-descriptions

  • Base model: google/flan-t5-base → this model