
014-microsoft-deberta-v3-base-finetuned-yahoo-80_20

This model is a fine-tuned version of microsoft/deberta-v3-base. The training dataset is not recorded in the card metadata; the model name suggests a Yahoo topic-classification dataset with an 80/20 split. It achieves the following results on the evaluation set (a usage sketch follows the metrics):

  • Loss: 3.3983
  • F1: 0.2344
  • Accuracy: 0.3
  • Precision: 0.2369
  • Recall: 0.3
  • System RAM Used (GB): 4.9456
  • System RAM Total (GB): 83.4807
  • GPU RAM Allocated (GB): 4.8430
  • GPU RAM Cached (GB): 7.0469
  • GPU RAM Total (GB): 39.5640
  • GPU Utilization (%): 11
  • Disk Space Used (GB): 40.4033
  • Disk Space Total (GB): 78.1898
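
The checkpoint appears to be a standard sequence-classification model, so it can be loaded with the transformers pipeline API. A minimal sketch, assuming the repository id of this card; the input text is illustrative only and the label set depends on the (undocumented) training data:

```python
from transformers import pipeline

# Load the fine-tuned checkpoint from the Hub.
# DeBERTa-v3 tokenizers require the sentencepiece package.
classifier = pipeline(
    "text-classification",
    model="diogopaes10/014-microsoft-deberta-v3-base-finetuned-yahoo-80_20",
)

print(classifier("How do I learn to play the guitar?"))
# -> [{'label': ..., 'score': ...}]  (labels depend on the training data)
```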

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 50
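
These values map directly onto transformers TrainingArguments. A minimal sketch reproducing the configuration above; the output directory name is illustrative, and dataset loading plus Trainer wiring are omitted because the card does not document them:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="014-microsoft-deberta-v3-base-finetuned-yahoo-80_20",  # illustrative
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    # Adam betas=(0.9, 0.999) and epsilon=1e-8 are the optimizer defaults,
    # written out explicitly to match the list above.
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=50,
)
```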

Training results

| Training Loss | Epoch | Step | Validation Loss | F1 | Accuracy | Precision | Recall | System RAM Used (GB) | System RAM Total (GB) | GPU RAM Allocated (GB) | GPU RAM Cached (GB) | GPU RAM Total (GB) | GPU Utilization (%) | Disk Space Used (GB) | Disk Space Total (GB) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 2.311 | 2.5 | 25 | 2.3033 | 0.0857 | 0.15 | 0.1105 | 0.15 | 5.0008 | 83.4807 | 4.8431 | 7.0469 | 39.5640 | 40 | 36.2797 | 78.1898 |
| 2.2703 | 5.0 | 50 | 2.3011 | 0.0686 | 0.2 | 0.0417 | 0.2 | 5.0121 | 83.4807 | 4.8431 | 7.0469 | 39.5640 | 38 | 36.2797 | 78.1898 |
| 2.0062 | 7.5 | 75 | 2.2817 | 0.0794 | 0.15 | 0.0543 | 0.15 | 4.9856 | 83.4807 | 4.8431 | 7.0469 | 39.5640 | 42 | 36.2797 | 78.1898 |
| 1.49 | 10.0 | 100 | 2.3281 | 0.1178 | 0.2 | 0.0869 | 0.2 | 4.9824 | 83.4807 | 4.8431 | 7.0469 | 39.5640 | 37 | 36.2797 | 78.1898 |
| 0.9424 | 12.5 | 125 | 2.3475 | 0.1733 | 0.25 | 0.1417 | 0.25 | 4.9446 | 83.4807 | 4.8431 | 7.0469 | 39.5640 | 42 | 36.2798 | 78.1898 |
| 0.5591 | 15.0 | 150 | 2.4503 | 0.1744 | 0.25 | 0.1452 | 0.25 | 4.9201 | 83.4807 | 4.8431 | 7.0469 | 39.5640 | 29 | 36.2798 | 78.1898 |
| 0.2893 | 17.5 | 175 | 2.5557 | 0.1744 | 0.25 | 0.1452 | 0.25 | 4.9618 | 83.4807 | 4.8431 | 7.0469 | 39.5640 | 43 | 36.2798 | 78.1898 |
| 0.1623 | 20.0 | 200 | 2.6218 | 0.2411 | 0.3 | 0.2452 | 0.3 | 4.9110 | 83.4807 | 4.8431 | 7.0469 | 39.5640 | 30 | 36.2799 | 78.1898 |
| 0.0817 | 22.5 | 225 | 2.7346 | 0.24 | 0.3 | 0.2417 | 0.3 | 4.9413 | 83.4807 | 4.8431 | 7.0469 | 39.5640 | 38 | 36.2799 | 78.1898 |
| 0.0475 | 25.0 | 250 | 2.9325 | 0.2344 | 0.3 | 0.2369 | 0.3 | 4.9314 | 83.4807 | 4.8431 | 7.0469 | 39.5640 | 40 | 36.2800 | 78.1898 |
| 0.0322 | 27.5 | 275 | 3.1235 | 0.2511 | 0.3 | 0.2869 | 0.3 | 4.9336 | 83.4807 | 4.8431 | 7.0469 | 39.5640 | 42 | 36.2800 | 78.1898 |
| 0.0254 | 30.0 | 300 | 3.1455 | 0.2344 | 0.3 | 0.2369 | 0.3 | 4.9387 | 83.4807 | 4.8431 | 7.0469 | 39.5640 | 30 | 36.2800 | 78.1898 |
| 0.0195 | 32.5 | 325 | 3.2767 | 0.2344 | 0.3 | 0.2369 | 0.3 | 4.9198 | 83.4807 | 4.8431 | 7.0469 | 39.5640 | 38 | 36.2801 | 78.1898 |
| 0.0163 | 35.0 | 350 | 3.3281 | 0.2344 | 0.3 | 0.2369 | 0.3 | 4.9709 | 83.4807 | 4.8431 | 7.0469 | 39.5640 | 42 | 40.4031 | 78.1898 |
| 0.015 | 37.5 | 375 | 3.3318 | 0.2344 | 0.3 | 0.2369 | 0.3 | 4.9642 | 83.4807 | 4.8431 | 7.0469 | 39.5640 | 41 | 40.4032 | 78.1898 |
| 0.0133 | 40.0 | 400 | 3.3617 | 0.2511 | 0.3 | 0.2869 | 0.3 | 4.9608 | 83.4807 | 4.8431 | 7.0469 | 39.5640 | 37 | 40.4032 | 78.1898 |
| 0.0127 | 42.5 | 425 | 3.3788 | 0.2344 | 0.3 | 0.2369 | 0.3 | 4.9617 | 83.4807 | 4.8431 | 7.0469 | 39.5640 | 39 | 40.4032 | 78.1898 |
| 0.0129 | 45.0 | 450 | 3.3928 | 0.2511 | 0.3 | 0.2869 | 0.3 | 4.9576 | 83.4807 | 4.8431 | 7.0469 | 39.5640 | 41 | 40.4032 | 78.1898 |
| 0.0121 | 47.5 | 475 | 3.3897 | 0.2344 | 0.3 | 0.2369 | 0.3 | 4.9421 | 83.4807 | 4.8431 | 7.0469 | 39.5640 | 37 | 40.4033 | 78.1898 |
| 0.0124 | 50.0 | 500 | 3.3983 | 0.2344 | 0.3 | 0.2369 | 0.3 | 4.9581 | 83.4807 | 4.8431 | 7.0469 | 39.5640 | 38 | 40.4033 | 78.1898 |
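
Recall equals accuracy in every row, which is consistent with weighted-average metrics over an imbalanced label set. A sketch of a compute_metrics function that would produce these columns under that assumption (the card does not state the averaging mode):

```python
import numpy as np
from sklearn.metrics import accuracy_score, f1_score, precision_score, recall_score

def compute_metrics(eval_pred):
    # Weighted averaging is an assumption; with it, recall reduces to
    # plain accuracy, matching the Recall and Accuracy columns above.
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    return {
        "f1": f1_score(labels, preds, average="weighted"),
        "accuracy": accuracy_score(labels, preds),
        "precision": precision_score(labels, preds, average="weighted", zero_division=0),
        "recall": recall_score(labels, preds, average="weighted"),
    }
```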

Framework versions

  • Transformers 4.31.0
  • Pytorch 2.0.1+cu118
  • Datasets 2.13.1
  • Tokenizers 0.13.3
