
las-20s_asr-scr_w2v2-base_002

This model is a fine-tuned version of facebook/wav2vec2-base on an unspecified dataset. It achieves the following results on the evaluation set (see the metric-computation sketch after this list):

  • Loss: 1.9105
  • PER: 0.1627
  • PCC: 0.6615
  • CTC loss: 0.5322
  • MSE loss: 1.2951
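
Given the checkpoint name (ASR + scoring) and the paired CTC/MSE losses, PER is read here as phoneme error rate and PCC as the Pearson correlation coefficient between predicted and reference scores; both readings, and all data in the snippet, are assumptions for illustration only. A minimal sketch of computing the two metrics with the evaluate library and scipy:

```python
# Minimal sketch: PER treated as word-error-rate logic over space-separated
# phoneme tokens, PCC as Pearson correlation between score sequences.
# All transcripts and scores below are hypothetical placeholders.
import evaluate
from scipy.stats import pearsonr

wer = evaluate.load("wer")

# Hypothetical phoneme transcripts (space-separated phoneme symbols).
reference_phones = ["hh ah l ow w er l d"]
predicted_phones = ["hh ah l ow w eh l d"]
per = wer.compute(predictions=predicted_phones, references=reference_phones)

# Hypothetical pronunciation scores for a small evaluation set.
reference_scores = [4.0, 2.5, 3.5]
predicted_scores = [3.8, 2.9, 3.3]
pcc, _ = pearsonr(predicted_scores, reference_scores)

print(f"PER: {per:.4f}  PCC: {pcc:.4f}")
```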

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the TrainingArguments sketch after this list):

  • learning_rate: 1e-05
  • train_batch_size: 16
  • eval_batch_size: 1
  • seed: 2222
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 2226
  • training_steps: 22260
  • mixed_precision_training: Native AMP
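
These settings map directly onto fields of transformers.TrainingArguments. Below is a hedged sketch of an equivalent configuration; the output directory and anything not listed above are placeholders rather than values from the original run.

```python
# Sketch of a TrainingArguments configuration matching the listed
# hyperparameters; unlisted options (output_dir, logging, etc.) are
# placeholder assumptions, not taken from the original training run.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="las-20s_asr-scr_w2v2-base_002",  # placeholder
    learning_rate=1e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=1,
    seed=2222,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=2226,
    max_steps=22260,
    fp16=True,  # "Native AMP" mixed-precision training
)
```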

Training results

| Training Loss | Epoch | Step  | Validation Loss | PER    | PCC    | CTC Loss | MSE Loss |
|:-------------:|:-----:|:-----:|:---------------:|:------:|:------:|:--------:|:--------:|
| 39.6857       | 1.0   | 742   | 18.9148         | 0.9897 | 0.3162 | 5.7239   | 13.2193  |
| 9.7583        | 2.0   | 1484  | 5.1408          | 0.9897 | 0.5908 | 3.8631   | 1.3041   |
| 4.8781        | 3.0   | 2226  | 5.1735          | 0.9897 | 0.6553 | 3.7845   | 1.4538   |
| 4.6095        | 4.0   | 2968  | 4.6937          | 0.9897 | 0.6826 | 3.7654   | 1.0306   |
| 4.4289        | 5.0   | 3710  | 4.6305          | 0.9897 | 0.6823 | 3.7502   | 1.0155   |
| 4.2524        | 6.0   | 4452  | 4.8477          | 0.9897 | 0.6824 | 3.6937   | 1.3131   |
| 4.0682        | 7.0   | 5194  | 4.6683          | 0.9745 | 0.6899 | 3.6168   | 1.2312   |
| 3.8969        | 8.0   | 5936  | 4.4112          | 0.9719 | 0.6860 | 3.5709   | 1.0439   |
| 3.7246        | 9.0   | 6678  | 4.3463          | 0.9717 | 0.6836 | 3.4542   | 1.1059   |
| 3.4041        | 10.0  | 7420  | 4.0102          | 0.9714 | 0.6710 | 3.0131   | 1.1831   |
| 2.8571        | 11.0  | 8162  | 3.2679          | 0.7278 | 0.6639 | 2.2983   | 1.0980   |
| 2.1949        | 12.0  | 8904  | 3.0018          | 0.4811 | 0.6546 | 1.6107   | 1.4258   |
| 1.6457        | 13.0  | 9646  | 2.4409          | 0.3084 | 0.6579 | 1.1488   | 1.2846   |
| 1.3213        | 14.0  | 10388 | 2.0040          | 0.2490 | 0.6628 | 0.9346   | 1.0566   |
| 1.1321        | 15.0  | 11130 | 2.0496          | 0.2280 | 0.6471 | 0.8158   | 1.1931   |
| 1.0106        | 16.0  | 11872 | 2.0461          | 0.2133 | 0.6625 | 0.7316   | 1.2561   |
| 0.9147        | 17.0  | 12614 | 1.9235          | 0.1973 | 0.6696 | 0.6863   | 1.1818   |
| 0.8449        | 18.0  | 13356 | 1.9574          | 0.1876 | 0.6619 | 0.6482   | 1.2421   |
| 0.7954        | 19.0  | 14098 | 2.1967          | 0.1836 | 0.6674 | 0.6185   | 1.4764   |
| 0.7419        | 20.0  | 14840 | 1.8551          | 0.1775 | 0.6709 | 0.5962   | 1.1941   |
| 0.7084        | 21.0  | 15582 | 1.9213          | 0.1746 | 0.6592 | 0.5807   | 1.2646   |
| 0.6668        | 22.0  | 16324 | 1.9464          | 0.1721 | 0.6689 | 0.5661   | 1.2983   |
| 0.6403        | 23.0  | 17066 | 1.8137          | 0.1685 | 0.6666 | 0.5603   | 1.1886   |
| 0.616         | 24.0  | 17808 | 1.7154          | 0.1664 | 0.6654 | 0.5502   | 1.1127   |
| 0.6046        | 25.0  | 18550 | 1.6261          | 0.1657 | 0.6642 | 0.5434   | 1.0422   |
| 0.5996        | 26.0  | 19292 | 1.6967          | 0.1648 | 0.6636 | 0.5424   | 1.1041   |
| 0.58          | 27.0  | 20034 | 1.8841          | 0.1640 | 0.6625 | 0.5367   | 1.2689   |
| 0.5592        | 28.0  | 20776 | 1.8360          | 0.1631 | 0.6626 | 0.5345   | 1.2297   |
| 0.5502        | 29.0  | 21518 | 1.9516          | 0.1630 | 0.6581 | 0.5339   | 1.3286   |
| 0.5398        | 30.0  | 22260 | 1.9105          | 0.1627 | 0.6615 | 0.5322   | 1.2951   |

Framework versions

  • Transformers 4.38.1
  • Pytorch 2.0.1
  • Datasets 2.18.0
  • Tokenizers 0.15.2
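
To check that a local environment matches the versions listed above, a quick sanity check (not part of the original card) is:

```python
# Print installed versions to compare against the ones listed above.
import datasets
import tokenizers
import torch
import transformers

print("Transformers:", transformers.__version__)  # expected 4.38.1
print("PyTorch:     ", torch.__version__)         # expected 2.0.1
print("Datasets:    ", datasets.__version__)      # expected 2.18.0
print("Tokenizers:  ", tokenizers.__version__)    # expected 0.15.2
```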
