
009-microsoft-deberta-v3-base-finetuned-yahoo-800_200

This model is a fine-tuned version of microsoft/deberta-v3-base. The training dataset is not recorded in this card (it is listed as "None"); the model name suggests a Yahoo Answers subset with an 800/200 train/evaluation split. It achieves the following results on the evaluation set:

  • Loss: 1.1599
  • F1: 0.6588
  • Accuracy: 0.66
  • Precision: 0.6659
  • Recall: 0.66
  • System RAM used: 5.0546 GB
  • System RAM total: 83.4807 GB
  • GPU RAM allocated: 4.1727 GB
  • GPU RAM cached: 26.7715 GB
  • GPU RAM total: 39.5640 GB
  • GPU utilization: 56%
  • Disk space used: 40.6642 GB
  • Disk space total: 78.1898 GB
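
The card ships no usage snippet, so here is a minimal inference sketch using the transformers pipeline API. The full repo id (diogopaes10/009-microsoft-deberta-v3-base-finetuned-yahoo-800_200) is taken from this card; the example sentence is illustrative, and the label names the checkpoint emits depend on an id2label mapping that is not documented here.

```python
from transformers import pipeline

# Minimal inference sketch. The repo id comes from this card; the example
# sentence is illustrative. The id2label mapping is undocumented, so outputs
# may read "LABEL_0" ... "LABEL_9" unless the config maps ids to topic names.
classifier = pipeline(
    "text-classification",
    model="diogopaes10/009-microsoft-deberta-v3-base-finetuned-yahoo-800_200",
)

print(classifier("What is the best way to learn a new programming language?"))
```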

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch of how they map onto TrainingArguments follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 15
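
For reference, a minimal sketch of how these values map onto transformers.TrainingArguments in Transformers 4.31. The output_dir and any evaluation or logging cadence are assumptions, since the card does not record them; Adam with these betas/epsilon and a linear schedule matches the values listed above.

```python
from transformers import TrainingArguments

# Sketch only: maps the listed hyperparameters onto TrainingArguments.
# output_dir is an assumption; the card does not record it.
training_args = TrainingArguments(
    output_dir="009-microsoft-deberta-v3-base-finetuned-yahoo-800_200",  # assumed
    learning_rate=2e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    num_train_epochs=15,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```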

Training results

| Training Loss | Epoch | Step | Validation Loss | F1 | Accuracy | Precision | Recall | System RAM Used (GB) | System RAM Total (GB) | GPU RAM Allocated (GB) | GPU RAM Cached (GB) | GPU RAM Total (GB) | GPU Utilization (%) | Disk Space Used (GB) | Disk Space Total (GB) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 2.3022 | 0.76 | 19 | 2.3012 | 0.0182 | 0.1 | 0.01 | 0.1 | 4.4456 | 83.4807 | 4.1727 | 26.7598 | 39.5640 | 45 | 33.7570 | 78.1898 |
| 2.2979 | 1.52 | 38 | 2.2854 | 0.0635 | 0.155 | 0.0449 | 0.155 | 5.0347 | 83.4807 | 4.1727 | 26.7715 | 39.5640 | 43 | 38.5922 | 78.1898 |
| 2.2316 | 2.28 | 57 | 2.1098 | 0.2285 | 0.305 | 0.2806 | 0.305 | 5.1781 | 83.4807 | 4.1727 | 26.7715 | 39.5640 | 44 | 40.6639 | 78.1898 |
| 1.9915 | 3.04 | 76 | 1.8477 | 0.4148 | 0.43 | 0.5040 | 0.43 | 5.1741 | 83.4807 | 4.1727 | 26.7715 | 39.5640 | 50 | 40.6639 | 78.1898 |
| 1.684 | 3.8 | 95 | 1.6027 | 0.5272 | 0.55 | 0.5666 | 0.55 | 5.1766 | 83.4807 | 4.1728 | 26.7715 | 39.5640 | 47 | 40.6639 | 78.1898 |
| 1.3911 | 4.56 | 114 | 1.4365 | 0.6060 | 0.615 | 0.6199 | 0.615 | 5.1746 | 83.4807 | 4.1728 | 26.7715 | 39.5640 | 49 | 40.6640 | 78.1898 |
| 1.1477 | 5.32 | 133 | 1.2565 | 0.6215 | 0.615 | 0.6419 | 0.615 | 5.1586 | 83.4807 | 4.1728 | 26.7715 | 39.5640 | 52 | 40.6640 | 78.1898 |
| 0.9198 | 6.08 | 152 | 1.1759 | 0.6400 | 0.64 | 0.6532 | 0.64 | 5.1810 | 83.4807 | 4.1727 | 26.7715 | 39.5640 | 55 | 40.6640 | 78.1898 |
| 0.7605 | 6.84 | 171 | 1.1128 | 0.6418 | 0.645 | 0.6564 | 0.645 | 5.1415 | 83.4807 | 4.1727 | 26.7715 | 39.5640 | 45 | 40.6640 | 78.1898 |
| 0.6093 | 7.6 | 190 | 1.0767 | 0.6678 | 0.67 | 0.6758 | 0.67 | 5.1347 | 83.4807 | 4.1728 | 26.7715 | 39.5640 | 43 | 40.6640 | 78.1898 |
| 0.5111 | 8.36 | 209 | 1.1033 | 0.6552 | 0.655 | 0.6742 | 0.655 | 5.1206 | 83.4807 | 4.1728 | 26.7715 | 39.5640 | 52 | 40.6641 | 78.1898 |
| 0.3828 | 9.12 | 228 | 1.1063 | 0.6875 | 0.69 | 0.6927 | 0.69 | 5.1484 | 83.4807 | 4.1727 | 26.7715 | 39.5640 | 44 | 40.6641 | 78.1898 |
| 0.3082 | 9.88 | 247 | 1.1240 | 0.6573 | 0.665 | 0.6595 | 0.665 | 5.1437 | 83.4807 | 4.1728 | 26.7715 | 39.5640 | 45 | 40.6641 | 78.1898 |
| 0.2716 | 10.64 | 266 | 1.1572 | 0.6604 | 0.665 | 0.6665 | 0.665 | 5.0689 | 83.4807 | 4.1728 | 26.7715 | 39.5640 | 45 | 40.6641 | 78.1898 |
| 0.2442 | 11.4 | 285 | 1.1058 | 0.6765 | 0.675 | 0.6827 | 0.675 | 5.0316 | 83.4807 | 4.1728 | 26.7715 | 39.5640 | 42 | 40.6641 | 78.1898 |
| 0.1791 | 12.16 | 304 | 1.1455 | 0.6445 | 0.645 | 0.6515 | 0.645 | 5.0715 | 83.4807 | 4.1728 | 26.7715 | 39.5640 | 46 | 40.6641 | 78.1898 |
| 0.1604 | 12.92 | 323 | 1.1514 | 0.6578 | 0.66 | 0.6686 | 0.66 | 5.0728 | 83.4807 | 4.1728 | 26.7715 | 39.5640 | 57 | 40.6641 | 78.1898 |
| 0.1389 | 13.68 | 342 | 1.1600 | 0.6715 | 0.675 | 0.6808 | 0.675 | 5.0655 | 83.4807 | 4.1727 | 26.7715 | 39.5640 | 48 | 40.6642 | 78.1898 |
| 0.151 | 14.44 | 361 | 1.1573 | 0.6626 | 0.665 | 0.6687 | 0.665 | 5.0588 | 83.4807 | 4.1727 | 26.7715 | 39.5640 | 48 | 40.6642 | 78.1898 |
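
The card does not say how F1, precision, and recall were aggregated. Recall equals accuracy at every logged step, which is what weighted averaging produces, so the sketch below assumes average="weighted"; the function name and the use of scikit-learn are likewise assumptions, not taken from the card.

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

# Sketch of a compute_metrics consistent with the columns above. "weighted"
# is an inference: weighted recall equals accuracy, matching every logged step.
def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="weighted", zero_division=0
    )
    return {
        "f1": f1,
        "accuracy": accuracy_score(labels, preds),
        "precision": precision,
        "recall": recall,
    }
```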

Framework versions

  • Transformers 4.31.0
  • Pytorch 2.0.1+cu118
  • Datasets 2.13.1
  • Tokenizers 0.13.3
