---
license: apache-2.0
base_model: facebook/wav2vec2-base
tags:
  - generated_from_trainer
metrics:
  - accuracy
model-index:
  - name: SER_model_xapiens
    results: []
---

# SER_model_xapiens

This model is a fine-tuned version of [facebook/wav2vec2-base](https://huggingface.co/facebook/wav2vec2-base) on an unspecified dataset. It achieves the following results on the evaluation set:

- Loss: 2.5738
- Accuracy: 0.3041

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 3e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 256
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
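As a sanity check on the values above, the reported `total_train_batch_size` is the per-device batch size multiplied by the gradient-accumulation steps (assuming a single device, which the card does not state):

```python
# Effective batch size implied by the hyperparameters above.
train_batch_size = 64
gradient_accumulation_steps = 4
num_devices = 1  # assumption: the card does not state the device count

total_train_batch_size = train_batch_size * gradient_accumulation_steps * num_devices
print(total_train_batch_size)  # 256, matching the reported total_train_batch_size
```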

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 1.0   | 9    | 1.7645          | 0.4076   |
| 1.7866        | 2.0   | 18   | 1.7565          | 0.4076   |
| 1.7798        | 3.0   | 27   | 1.7455          | 0.4076   |
| 1.7644        | 4.0   | 36   | 1.7316          | 0.4076   |
| 1.7493        | 5.0   | 45   | 1.7171          | 0.4181   |
| 1.7117        | 6.0   | 54   | 1.7175          | 0.4233   |
| 1.7218        | 7.0   | 63   | 1.7061          | 0.4220   |
| 1.6752        | 8.0   | 72   | 1.7257          | 0.3958   |
| 1.6892        | 9.0   | 81   | 1.7137          | 0.4181   |
| 1.6567        | 10.0  | 90   | 1.7058          | 0.4220   |
| 1.6567        | 11.0  | 99   | 1.7144          | 0.4299   |
| 1.6334        | 12.0  | 108  | 1.7087          | 0.4076   |
| 1.6437        | 13.0  | 117  | 1.7158          | 0.4273   |
| 1.6167        | 14.0  | 126  | 1.7296          | 0.4076   |
| 1.5712        | 15.0  | 135  | 1.7449          | 0.4168   |
| 1.5781        | 16.0  | 144  | 1.7459          | 0.4246   |
| 1.5872        | 17.0  | 153  | 1.7217          | 0.4168   |
| 1.5484        | 18.0  | 162  | 1.7471          | 0.4128   |
| 1.4936        | 19.0  | 171  | 1.7770          | 0.3893   |
| 1.4869        | 20.0  | 180  | 1.7677          | 0.4076   |
| 1.4869        | 21.0  | 189  | 1.8222          | 0.3709   |
| 1.4602        | 22.0  | 198  | 1.7785          | 0.4024   |
| 1.4563        | 23.0  | 207  | 1.8191          | 0.4024   |
| 1.3762        | 24.0  | 216  | 1.7934          | 0.4115   |
| 1.3903        | 25.0  | 225  | 1.8111          | 0.3919   |
| 1.3899        | 26.0  | 234  | 1.9607          | 0.3185   |
| 1.3592        | 27.0  | 243  | 1.8800          | 0.3775   |
| 1.3378        | 28.0  | 252  | 1.8500          | 0.4063   |
| 1.3135        | 29.0  | 261  | 1.8792          | 0.3840   |
| 1.2669        | 30.0  | 270  | 1.9259          | 0.3735   |
| 1.2669        | 31.0  | 279  | 2.0078          | 0.3224   |
| 1.2779        | 32.0  | 288  | 1.9653          | 0.3421   |
| 1.2368        | 33.0  | 297  | 1.9823          | 0.3499   |
| 1.2116        | 34.0  | 306  | 1.9602          | 0.3775   |
| 1.1851        | 35.0  | 315  | 2.0362          | 0.3447   |
| 1.1703        | 36.0  | 324  | 2.0193          | 0.3434   |
| 1.1224        | 37.0  | 333  | 2.0363          | 0.3552   |
| 1.1211        | 38.0  | 342  | 2.0817          | 0.3277   |
| 1.0862        | 39.0  | 351  | 2.0775          | 0.3355   |
| 1.0847        | 40.0  | 360  | 2.1078          | 0.3303   |
| 1.0847        | 41.0  | 369  | 2.1608          | 0.3067   |
| 1.0448        | 42.0  | 378  | 2.0996          | 0.3591   |
| 1.0356        | 43.0  | 387  | 2.1416          | 0.3329   |
| 1.008         | 44.0  | 396  | 2.1892          | 0.3250   |
| 1.0018        | 45.0  | 405  | 2.1390          | 0.3408   |
| 0.9606        | 46.0  | 414  | 2.2137          | 0.3250   |
| 0.9489        | 47.0  | 423  | 2.1843          | 0.3329   |
| 0.9404        | 48.0  | 432  | 2.2826          | 0.2988   |
| 0.9483        | 49.0  | 441  | 2.2073          | 0.3355   |
| 0.9101        | 50.0  | 450  | 2.1827          | 0.3565   |
| 0.9101        | 51.0  | 459  | 2.2765          | 0.3185   |
| 0.896         | 52.0  | 468  | 2.2909          | 0.3080   |
| 0.8558        | 53.0  | 477  | 2.2816          | 0.3014   |
| 0.8433        | 54.0  | 486  | 2.2215          | 0.3342   |
| 0.8194        | 55.0  | 495  | 2.2022          | 0.3565   |
| 0.8436        | 56.0  | 504  | 2.2139          | 0.3630   |
| 0.7808        | 57.0  | 513  | 2.2970          | 0.3054   |
| 0.8009        | 58.0  | 522  | 2.3081          | 0.3329   |
| 0.7575        | 59.0  | 531  | 2.3105          | 0.3119   |
| 0.7894        | 60.0  | 540  | 2.3786          | 0.3028   |
| 0.7894        | 61.0  | 549  | 2.3072          | 0.3185   |
| 0.7532        | 62.0  | 558  | 2.3318          | 0.3014   |
| 0.7222        | 63.0  | 567  | 2.3697          | 0.3172   |
| 0.7116        | 64.0  | 576  | 2.4125          | 0.3198   |
| 0.7286        | 65.0  | 585  | 2.3694          | 0.3119   |
| 0.6906        | 66.0  | 594  | 2.3974          | 0.3028   |
| 0.6642        | 67.0  | 603  | 2.4020          | 0.3001   |
| 0.6737        | 68.0  | 612  | 2.3513          | 0.3316   |
| 0.6571        | 69.0  | 621  | 2.3638          | 0.3447   |
| 0.6482        | 70.0  | 630  | 2.4507          | 0.3014   |
| 0.6482        | 71.0  | 639  | 2.4796          | 0.2831   |
| 0.6407        | 72.0  | 648  | 2.4475          | 0.3119   |
| 0.6198        | 73.0  | 657  | 2.5029          | 0.2883   |
| 0.6087        | 74.0  | 666  | 2.4490          | 0.3119   |
| 0.6074        | 75.0  | 675  | 2.4068          | 0.3316   |
| 0.5941        | 76.0  | 684  | 2.4878          | 0.3106   |
| 0.5922        | 77.0  | 693  | 2.5032          | 0.2988   |
| 0.6028        | 78.0  | 702  | 2.4904          | 0.3054   |
| 0.5867        | 79.0  | 711  | 2.4592          | 0.3224   |
| 0.5677        | 80.0  | 720  | 2.5277          | 0.2936   |
| 0.5677        | 81.0  | 729  | 2.4852          | 0.3211   |
| 0.5459        | 82.0  | 738  | 2.5210          | 0.3014   |
| 0.5346        | 83.0  | 747  | 2.5432          | 0.3001   |
| 0.5572        | 84.0  | 756  | 2.4974          | 0.3172   |
| 0.5246        | 85.0  | 765  | 2.5649          | 0.3001   |
| 0.5163        | 86.0  | 774  | 2.5256          | 0.3198   |
| 0.5231        | 87.0  | 783  | 2.5010          | 0.3224   |
| 0.5141        | 88.0  | 792  | 2.5648          | 0.2975   |
| 0.5014        | 89.0  | 801  | 2.5405          | 0.3080   |
| 0.5125        | 90.0  | 810  | 2.5851          | 0.2988   |
| 0.5125        | 91.0  | 819  | 2.5525          | 0.3001   |
| 0.4922        | 92.0  | 828  | 2.5313          | 0.3263   |
| 0.5109        | 93.0  | 837  | 2.5660          | 0.3028   |
| 0.4847        | 94.0  | 846  | 2.5406          | 0.3132   |
| 0.4702        | 95.0  | 855  | 2.5681          | 0.3028   |
| 0.4929        | 96.0  | 864  | 2.5796          | 0.2988   |
| 0.4653        | 97.0  | 873  | 2.5813          | 0.2962   |
| 0.4866        | 98.0  | 882  | 2.5720          | 0.3028   |
| 0.4591        | 99.0  | 891  | 2.5717          | 0.3054   |
| 0.4891        | 100.0 | 900  | 2.5738          | 0.3041   |
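The log shows training loss falling steadily while validation loss reaches its minimum near epoch 10 and accuracy peaks at epoch 11 (0.4299), after which the model overfits down to the final 0.3041. A minimal sketch, with values copied from selected rows of the table above, that locates the best checkpoint by validation accuracy:

```python
# (epoch, validation_loss, accuracy) for selected rows of the table above.
eval_log = [
    (1, 1.7645, 0.4076),
    (10, 1.7058, 0.4220),
    (11, 1.7144, 0.4299),
    (26, 1.9607, 0.3185),
    (50, 2.1827, 0.3565),
    (100, 2.5738, 0.3041),
]

# Best checkpoint by validation accuracy, ties broken by lower validation loss.
best_epoch, best_loss, best_acc = max(eval_log, key=lambda r: (r[2], -r[1]))
print(best_epoch, best_acc)  # 11 0.4299
```

In practice, setting `load_best_model_at_end=True` with `metric_for_best_model="accuracy"` in `TrainingArguments` would retain that checkpoint automatically instead of the final, overfit one.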

### Framework versions

- Transformers 4.42.0.dev0
- Pytorch 2.3.0
- Datasets 2.19.2.dev0
- Tokenizers 0.19.1