---
license: apache-2.0
base_model: facebook/wav2vec2-base
tags:
  - generated_from_trainer
metrics:
  - wer
model-index:
  - name: wav2vec2-base-timit-demo-google-colab-Ezra_William
    results: []
---

# wav2vec2-base-timit-demo-google-colab-Ezra_William

This model is a fine-tuned version of [facebook/wav2vec2-base](https://huggingface.co/facebook/wav2vec2-base) on an unspecified dataset. It achieves the following results on the evaluation set (a usage sketch follows the list):

- Loss: 0.5198
- Wer: 0.3335
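
The card itself does not include usage code, so the following is a minimal transcription sketch with `transformers`. The Hub repo id is an assumption inferred from the model name, `sample.wav` is a placeholder file, and greedy CTC decoding is one common choice, not necessarily what the author used:

```python
import librosa
import torch
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

# Hypothetical Hub repo id inferred from the model name; replace with the actual path.
model_id = "EzraWilliam/wav2vec2-base-timit-demo-google-colab-Ezra_William"

processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# wav2vec2-base expects 16 kHz mono audio; "sample.wav" is a placeholder.
speech, _ = librosa.load("sample.wav", sr=16_000, mono=True)

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: argmax per frame, then collapse repeats and blanks.
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```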

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch reproducing them follows the list):

- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 30
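
As a minimal sketch, the hyperparameters above map onto `transformers` `TrainingArguments` as shown below. The `output_dir` value and the evaluation cadence are assumptions (the results table suggests evaluation every 500 steps); everything else is taken from the list:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    # Assumed output directory, named after the model; not stated in the card.
    output_dir="wav2vec2-base-timit-demo-google-colab-Ezra_William",
    learning_rate=1e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=30,
    lr_scheduler_type="linear",
    warmup_steps=1000,
    # Matches the Adam settings above (also the transformers defaults).
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    # Assumption: the results table logs validation metrics every 500 steps.
    evaluation_strategy="steps",
    eval_steps=500,
)
```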

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Wer    |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 3.4828        | 1.0   | 500   | 1.6354          | 1.0429 |
| 0.8406        | 2.01  | 1000  | 0.5389          | 0.5405 |
| 0.4345        | 3.01  | 1500  | 0.4202          | 0.4438 |
| 0.2912        | 4.02  | 2000  | 0.4195          | 0.4216 |
| 0.2316        | 5.02  | 2500  | 0.4253          | 0.4051 |
| 0.1917        | 6.02  | 3000  | 0.3969          | 0.3958 |
| 0.1545        | 7.03  | 3500  | 0.4291          | 0.3912 |
| 0.1423        | 8.03  | 4000  | 0.4349          | 0.3731 |
| 0.1234        | 9.04  | 4500  | 0.4419          | 0.3784 |
| 0.1124        | 10.04 | 5000  | 0.4713          | 0.3741 |
| 0.0991        | 11.04 | 5500  | 0.4711          | 0.3692 |
| 0.0924        | 12.05 | 6000  | 0.4994          | 0.3699 |
| 0.0809        | 13.05 | 6500  | 0.4888          | 0.3643 |
| 0.0715        | 14.06 | 7000  | 0.4828          | 0.3634 |
| 0.0646        | 15.06 | 7500  | 0.5058          | 0.3570 |
| 0.0604        | 16.06 | 8000  | 0.5586          | 0.3637 |
| 0.0571        | 17.07 | 8500  | 0.4991          | 0.3553 |
| 0.0532        | 18.07 | 9000  | 0.5317          | 0.3566 |
| 0.0471        | 19.08 | 9500  | 0.5308          | 0.3508 |
| 0.0449        | 20.08 | 10000 | 0.5362          | 0.3486 |
| 0.0373        | 21.08 | 10500 | 0.5211          | 0.3479 |
| 0.0351        | 22.09 | 11000 | 0.5132          | 0.3445 |
| 0.0333        | 23.09 | 11500 | 0.4927          | 0.3381 |
| 0.0302        | 24.1  | 12000 | 0.5330          | 0.3413 |
| 0.0282        | 25.1  | 12500 | 0.5295          | 0.3396 |
| 0.024         | 26.1  | 13000 | 0.5022          | 0.3356 |
| 0.0262        | 27.11 | 13500 | 0.5320          | 0.3329 |
| 0.0242        | 28.11 | 14000 | 0.5133          | 0.3326 |
| 0.0201        | 29.12 | 14500 | 0.5198          | 0.3335 |
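
For reference, the Wer column is word error rate: (substitutions + deletions + insertions) divided by the number of reference words. A minimal sketch of computing it with the `evaluate` library, using placeholder strings (the card does not show how its WER was computed, though `evaluate`/`jiwer` is the usual route):

```python
import evaluate

# WER = (substitutions + deletions + insertions) / reference word count.
wer_metric = evaluate.load("wer")

# Placeholder transcriptions for illustration only.
predictions = ["the cat sat on the mat"]
references = ["the cat sat on a mat"]

wer = wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}")  # 1 substitution / 6 reference words ≈ 0.1667
```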

### Framework versions

- Transformers 4.32.0
- Pytorch 2.0.1+cu118
- Datasets 1.18.3
- Tokenizers 0.13.3