---
license: apache-2.0
base_model: distilroberta-base
tags:
- generated_from_trainer
metrics:
- f1
model-index:
- name: distilroberta-topic-classification_5
  results: []
---
# distilroberta-topic-classification_5
This model is a fine-tuned version of [distilroberta-base](https://huggingface.co/distilroberta-base) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 2.7686
- F1: 0.6337
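
A minimal inference sketch using the Hugging Face Transformers `pipeline` API. The repository id below is a placeholder (the card does not state where this checkpoint is hosted), and the label names depend on the unknown training dataset:

```python
from transformers import pipeline

# Placeholder repo id -- substitute the actual location of this checkpoint.
classifier = pipeline(
    "text-classification",
    model="<user>/distilroberta-topic-classification_5",
)

# Prints the top predicted topic and its score, e.g.
# [{'label': 'LABEL_3', 'score': 0.71}] -- labels depend on the training data.
print(classifier("The central bank raised interest rates again this quarter."))
```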
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 12345
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 16
- num_epochs: 10
- mixed_precision_training: Native AMP
- label_smoothing_factor: 0.2
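
These values map directly onto the Transformers `TrainingArguments` class. A sketch of an equivalent configuration; `output_dir` is a placeholder, and any field not listed above keeps its library default (the Adam betas and epsilon shown in the list are the optimizer defaults):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="distilroberta-topic-classification_5",  # placeholder path
    learning_rate=5e-05,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=12345,
    lr_scheduler_type="linear",
    warmup_steps=16,
    num_train_epochs=10,
    fp16=True,  # Native AMP mixed-precision training
    label_smoothing_factor=0.2,
)
```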

### Training results

| Training Loss | Epoch | Step  | Validation Loss | F1     |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 2.6113        | 1.0   | 1305  | 2.6631          | 0.5832 |
| 2.4032        | 2.0   | 2610  | 2.6335          | 0.5943 |
| 2.3245        | 3.0   | 3915  | 2.6132          | 0.6196 |
| 2.2142        | 4.0   | 5220  | 2.6438          | 0.6226 |
| 2.0364        | 5.0   | 6525  | 2.6559          | 0.6323 |
| 2.03          | 6.0   | 7830  | 2.7057          | 0.6282 |
| 1.9461        | 7.0   | 9135  | 2.7222          | 0.6325 |
| 1.8751        | 8.0   | 10440 | 2.7435          | 0.6302 |
| 1.8463        | 9.0   | 11745 | 2.7668          | 0.6329 |
| 1.9001        | 10.0  | 13050 | 2.7686          | 0.6337 |
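
The F1 column was presumably produced by a `compute_metrics` callback passed to the `Trainer`. A sketch using the Hugging Face `evaluate` library; the `weighted` averaging is an assumption, since the card does not state which multi-class average was used:

```python
import numpy as np
import evaluate

f1_metric = evaluate.load("f1")

def compute_metrics(eval_pred):
    # The Trainer supplies (logits, labels) at each evaluation step.
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    # "weighted" averaging is an assumption; the card only reports "F1".
    return f1_metric.compute(
        predictions=predictions, references=labels, average="weighted"
    )
```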

### Framework versions
- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.15.0
- Tokenizers 0.15.0