
wav2vec2-base-timit-demo-google-colab

This model is a fine-tuned version of facebook/wav2vec2-base on the TIMIT dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5328
  • WER: 0.2175
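The reported WER (word error rate) is the word-level edit distance between the predicted and reference transcripts, divided by the number of reference words. As a minimal, self-contained sketch of the metric (real evaluations typically use the `jiwer` or `evaluate` packages; this plain-Python version is only illustrative):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level Levenshtein distance / reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # Dynamic-programming table for edit distance over word sequences
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i  # deleting all i reference words
    for j in range(len(hyp) + 1):
        d[0][j] = j  # inserting all j hypothesis words
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            sub = d[i - 1][j - 1] + (ref[i - 1] != hyp[j - 1])
            d[i][j] = min(sub, d[i - 1][j] + 1, d[i][j - 1] + 1)
    return d[len(ref)][len(hyp)] / len(ref)

# One substitution ("sat" -> "sit") and one deletion ("the") over 6 words
print(wer("the cat sat on the mat", "the cat sit on mat"))  # 2/6 ≈ 0.333
```

So a WER of 0.2175 means roughly one word error for every 4–5 reference words.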

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0001
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 1000
  • num_epochs: 30
  • mixed_precision_training: Native AMP
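With `lr_scheduler_type: linear` and 1000 warmup steps, the learning rate ramps linearly from 0 to 1e-4 over the first 1000 optimizer steps, then decays linearly toward 0 over the remaining steps (roughly 14,500 total for this run, going by the step counts in the training log). A small sketch of that schedule, mirroring the behavior of Transformers' linear scheduler with warmup (the `total_steps` value here is an approximation from the log):

```python
def linear_warmup_lr(step, base_lr=1e-4, warmup_steps=1000, total_steps=14500):
    """Linear warmup to base_lr, then linear decay to 0 at total_steps."""
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    # After warmup: decay linearly from base_lr down to 0
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))

print(linear_warmup_lr(500))    # halfway through warmup -> 5e-05
print(linear_warmup_lr(1000))   # peak learning rate -> 0.0001
print(linear_warmup_lr(14500))  # end of training -> 0.0
```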

Training results

| Training Loss | Epoch | Step  | Validation Loss | WER    |
|---------------|-------|-------|-----------------|--------|
| 3.4608        | 1.0   | 500   | 1.4610          | 0.9290 |
| 0.796         | 2.01  | 1000  | 0.5283          | 0.4217 |
| 0.4219        | 3.01  | 1500  | 0.4277          | 0.3353 |
| 0.2919        | 4.02  | 2000  | 0.4154          | 0.3102 |
| 0.2263        | 5.02  | 2500  | 0.4096          | 0.2954 |
| 0.1885        | 6.02  | 3000  | 0.4274          | 0.2944 |
| 0.1595        | 7.03  | 3500  | 0.4529          | 0.2681 |
| 0.1371        | 8.03  | 4000  | 0.4309          | 0.2721 |
| 0.1218        | 9.04  | 4500  | 0.4574          | 0.2629 |
| 0.1118        | 10.04 | 5000  | 0.5396          | 0.2605 |
| 0.0988        | 11.04 | 5500  | 0.5031          | 0.2683 |
| 0.0944        | 12.05 | 6000  | 0.5040          | 0.2595 |
| 0.0781        | 13.05 | 6500  | 0.4909          | 0.2611 |
| 0.0714        | 14.06 | 7000  | 0.4740          | 0.2635 |
| 0.0648        | 15.06 | 7500  | 0.4613          | 0.2509 |
| 0.0628        | 16.06 | 8000  | 0.4731          | 0.2508 |
| 0.0539        | 17.07 | 8500  | 0.5100          | 0.2448 |
| 0.0542        | 18.07 | 9000  | 0.5048          | 0.2507 |
| 0.0453        | 19.08 | 9500  | 0.5290          | 0.2466 |
| 0.0446        | 20.08 | 10000 | 0.5482          | 0.2398 |
| 0.0405        | 21.08 | 10500 | 0.5768          | 0.2422 |
| 0.0368        | 22.09 | 11000 | 0.5848          | 0.2403 |
| 0.0349        | 23.09 | 11500 | 0.5469          | 0.2321 |
| 0.0326        | 24.1  | 12000 | 0.5618          | 0.2294 |
| 0.028         | 25.1  | 12500 | 0.5590          | 0.2297 |
| 0.0254        | 26.1  | 13000 | 0.5531          | 0.2291 |
| 0.0258        | 27.11 | 13500 | 0.5302          | 0.2215 |
| 0.0244        | 28.11 | 14000 | 0.5388          | 0.2188 |
| 0.0195        | 29.12 | 14500 | 0.5328          | 0.2175 |
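Note that validation loss bottoms out early (around 0.41 near epoch 5) while WER keeps improving through epoch 30, so WER is the more useful model-selection criterion for this run. A quick sketch of picking the best checkpoint from the log, using a few illustrative rows from the table above:

```python
# (epoch, validation_loss, wer) -- selected rows from the training log above
log = [
    (1.0, 1.4610, 0.9290),
    (5.02, 0.4096, 0.2954),
    (15.06, 0.4613, 0.2509),
    (29.12, 0.5328, 0.2175),
]

best_by_wer = min(log, key=lambda row: row[2])   # lowest word error rate
best_by_loss = min(log, key=lambda row: row[1])  # lowest validation loss

print(best_by_wer)   # (29.12, 0.5328, 0.2175)
print(best_by_loss)  # (5.02, 0.4096, 0.2954)
```

The two criteria disagree, which is common for CTC fine-tuning: loss can rise from over-confident predictions even as transcription accuracy improves.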

Framework versions

  • Transformers 4.35.2
  • Pytorch 2.1.0+cu118
  • Datasets 1.18.3
  • Tokenizers 0.15.0

Model tree for kinory24/wav2vec2-base-timit-demo-google-colab
