---
license: apache-2.0
base_model: facebook/wav2vec2-base
tags:
  - generated_from_trainer
metrics:
  - accuracy
model-index:
  - name: SER_model_xapiens
    results: []
---

# SER_model_xapiens

This model is a fine-tuned version of [facebook/wav2vec2-base](https://huggingface.co/facebook/wav2vec2-base) on an unspecified dataset. It achieves the following results on the evaluation set:

- Loss: 0.8673
- Accuracy: 0.7
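For reference, a minimal inference sketch using the Transformers `pipeline` API. The repository id (`Nugrahasetyaardi/SER_model_xapiens`), the audio file name, and the label set are assumptions, not confirmed by this card:

```python
# Hypothetical inference sketch -- the model id and audio path below are
# assumptions; substitute the actual repository id and a real audio file.
from transformers import pipeline

classifier = pipeline(
    "audio-classification",
    model="Nugrahasetyaardi/SER_model_xapiens",  # assumed repo id
)

# wav2vec2-base checkpoints expect 16 kHz mono audio.
predictions = classifier("speech_sample.wav")  # hypothetical file
for pred in predictions:
    print(pred["label"], round(pred["score"], 4))
```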

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 3e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 256
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
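As a sanity check, the derived quantities above follow from the per-device batch size and gradient accumulation; a short sketch (the total optimizer step count of 100 is read off the final row of the training results table):

```python
# Recompute the derived training quantities from the logged hyperparameters.
train_batch_size = 64
gradient_accumulation_steps = 4
total_train_batch_size = train_batch_size * gradient_accumulation_steps  # 256

lr_scheduler_warmup_ratio = 0.1
total_optimizer_steps = 100  # final step in the training results table
warmup_steps = int(lr_scheduler_warmup_ratio * total_optimizer_steps)  # 10

print(total_train_batch_size, warmup_steps)  # → 256 10
```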

### Training results

| Training Loss | Epoch   | Step | Validation Loss | Accuracy |
|:-------------:|:-------:|:----:|:---------------:|:--------:|
| No log        | 1.0     | 1    | 0.9329          | 0.6667   |
| No log        | 2.0     | 3    | 0.9100          | 0.7167   |
| No log        | 3.0     | 5    | 0.9221          | 0.7333   |
| No log        | 4.0     | 6    | 0.9121          | 0.7333   |
| No log        | 5.0     | 7    | 0.9391          | 0.6833   |
| No log        | 6.0     | 9    | 0.9593          | 0.6667   |
| 0.3147        | 7.0     | 11   | 0.9780          | 0.65     |
| 0.3147        | 8.0     | 12   | 0.9659          | 0.6667   |
| 0.3147        | 9.0     | 13   | 0.9612          | 0.6833   |
| 0.3147        | 10.0    | 15   | 0.9492          | 0.7333   |
| 0.3147        | 11.0    | 17   | 0.9959          | 0.6667   |
| 0.3147        | 12.0    | 18   | 0.9615          | 0.6167   |
| 0.3147        | 13.0    | 19   | 0.9160          | 0.65     |
| 0.2707        | 14.0    | 21   | 0.9024          | 0.6833   |
| 0.2707        | 15.0    | 23   | 0.9288          | 0.6667   |
| 0.2707        | 16.0    | 24   | 0.8897          | 0.7      |
| 0.2707        | 17.0    | 25   | 0.9008          | 0.6833   |
| 0.2707        | 18.0    | 27   | 0.9367          | 0.6833   |
| 0.2707        | 19.0    | 29   | 0.8923          | 0.7667   |
| 0.2198        | 20.0    | 30   | 0.8834          | 0.7833   |
| 0.2198        | 21.0    | 31   | 0.8932          | 0.7667   |
| 0.2198        | 22.0    | 33   | 0.8992          | 0.7167   |
| 0.2198        | 23.0    | 35   | 0.9127          | 0.7      |
| 0.2198        | 24.0    | 36   | 0.9262          | 0.6667   |
| 0.2198        | 25.0    | 37   | 0.9023          | 0.7333   |
| 0.2198        | 26.0    | 39   | 0.8684          | 0.75     |
| 0.1802        | 27.0    | 41   | 0.9270          | 0.6833   |
| 0.1802        | 28.0    | 42   | 0.9361          | 0.6833   |
| 0.1802        | 29.0    | 43   | 0.8819          | 0.7333   |
| 0.1802        | 30.0    | 45   | 0.8361          | 0.75     |
| 0.1802        | 31.0    | 47   | 0.8555          | 0.6833   |
| 0.1802        | 32.0    | 48   | 0.8470          | 0.7      |
| 0.1802        | 33.0    | 49   | 0.8418          | 0.75     |
| 0.1515        | 34.0    | 51   | 0.8077          | 0.7667   |
| 0.1515        | 35.0    | 53   | 0.8207          | 0.75     |
| 0.1515        | 36.0    | 54   | 0.8540          | 0.7333   |
| 0.1515        | 37.0    | 55   | 0.8494          | 0.7167   |
| 0.1515        | 38.0    | 57   | 0.8525          | 0.75     |
| 0.1515        | 39.0    | 59   | 0.8675          | 0.7167   |
| 0.1313        | 40.0    | 60   | 0.8809          | 0.7167   |
| 0.1313        | 41.0    | 61   | 0.8739          | 0.7333   |
| 0.1313        | 42.0    | 63   | 0.8957          | 0.6833   |
| 0.1313        | 43.0    | 65   | 0.9231          | 0.6833   |
| 0.1313        | 44.0    | 66   | 0.9400          | 0.6833   |
| 0.1313        | 45.0    | 67   | 0.9518          | 0.7      |
| 0.1313        | 46.0    | 69   | 0.9820          | 0.65     |
| 0.1179        | 47.0    | 71   | 0.9966          | 0.65     |
| 0.1179        | 48.0    | 72   | 0.9788          | 0.6667   |
| 0.1179        | 49.0    | 73   | 0.9537          | 0.6833   |
| 0.1179        | 50.0    | 75   | 0.9152          | 0.7      |
| 0.1179        | 51.0    | 77   | 0.8896          | 0.6833   |
| 0.1179        | 52.0    | 78   | 0.8610          | 0.6667   |
| 0.1179        | 53.0    | 79   | 0.8374          | 0.7      |
| 0.1049        | 54.0    | 81   | 0.8419          | 0.7333   |
| 0.1049        | 55.0    | 83   | 0.8516          | 0.6667   |
| 0.1049        | 56.0    | 84   | 0.8241          | 0.7      |
| 0.1049        | 57.0    | 85   | 0.8027          | 0.7167   |
| 0.1049        | 58.0    | 87   | 0.8049          | 0.75     |
| 0.1049        | 59.0    | 89   | 0.8238          | 0.75     |
| 0.0973        | 60.0    | 90   | 0.8284          | 0.7167   |
| 0.0973        | 61.0    | 91   | 0.8325          | 0.7      |
| 0.0973        | 62.0    | 93   | 0.8268          | 0.7333   |
| 0.0973        | 63.0    | 95   | 0.8333          | 0.7333   |
| 0.0973        | 64.0    | 96   | 0.8424          | 0.7333   |
| 0.0973        | 65.0    | 97   | 0.8505          | 0.7167   |
| 0.0973        | 66.0    | 99   | 0.8644          | 0.7      |
| 0.0935        | 66.6667 | 100  | 0.8673          | 0.7      |
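Note that validation accuracy peaks well before the final step: the best logged checkpoint is epoch 20 (accuracy 0.7833, loss 0.8834), while the card reports the last evaluation (accuracy 0.7). A small sketch scanning a few representative rows from the table above:

```python
# Pick the best row by validation accuracy from a subset of the log above.
# Each row is (epoch, validation_loss, accuracy).
log = [
    (19.0, 0.8923, 0.7667),
    (20.0, 0.8834, 0.7833),
    (34.0, 0.8077, 0.7667),
    (58.0, 0.8049, 0.75),
    (66.6667, 0.8673, 0.7),  # final evaluation reported above
]
best_epoch, best_loss, best_acc = max(log, key=lambda row: row[2])
print(best_epoch, best_acc)  # → 20.0 0.7833
```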

### Framework versions

- Transformers 4.42.0.dev0
- Pytorch 2.3.0
- Datasets 2.19.2.dev0
- Tokenizers 0.19.1