# k2e-20s_asr-scr_w2v2-base_002
This model is a fine-tuned version of facebook/wav2vec2-base on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 1.4632
- PER: 0.1621
- PCC: 0.5775
- CTC Loss: 0.5292
- MSE Loss: 0.9318
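The two non-loss metrics above can be reproduced with their standard definitions: PER (read here as phoneme error rate, the usual metric for wav2vec2 phoneme recognition) is a length-normalized edit distance, and PCC is the Pearson correlation coefficient. A minimal pure-Python sketch; the phoneme sequences and score lists below are illustrative, not taken from this model's evaluation set:

```python
import math

def per(reference, hypothesis):
    """Phoneme error rate: Levenshtein edit distance over phoneme
    tokens, normalized by the reference length."""
    m, n = len(reference), len(hypothesis)
    prev = list(range(n + 1))  # distances for the empty reference prefix
    for i in range(1, m + 1):
        curr = [i] + [0] * n
        for j in range(1, n + 1):
            cost = 0 if reference[i - 1] == hypothesis[j - 1] else 1
            curr[j] = min(prev[j] + 1,        # deletion
                          curr[j - 1] + 1,    # insertion
                          prev[j - 1] + cost) # substitution / match
        prev = curr
    return prev[n] / m

def pcc(x, y):
    """Pearson correlation coefficient between two score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Illustrative example: one substitution over three reference phonemes
print(per(["k", "ae", "t"], ["k", "ah", "t"]))  # 0.333...
```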
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 1
- seed: 2222
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 2235
- training_steps: 22350
- mixed_precision_training: Native AMP
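Under the linear scheduler with warmup implied by these hyperparameters, the learning rate ramps from 0 to 1e-05 over the first 2,235 steps and then decays linearly to 0 at step 22,350. A small sketch of that schedule; it mirrors (rather than imports) the behavior of transformers' `get_linear_schedule_with_warmup`:

```python
LEARNING_RATE = 1e-05
WARMUP_STEPS = 2235
TRAINING_STEPS = 22350

def linear_warmup_lr(step, lr=LEARNING_RATE,
                     warmup=WARMUP_STEPS, total=TRAINING_STEPS):
    """Learning rate at a given optimizer step: linear warmup to the
    peak lr, then linear decay to zero at the final training step."""
    if step < warmup:
        return lr * step / warmup  # warmup ramp
    # linear decay from the peak at `warmup` down to 0 at `total`
    return lr * max(0.0, (total - step) / (total - warmup))

print(linear_warmup_lr(0))      # 0.0
print(linear_warmup_lr(2235))   # peak: 1e-05
print(linear_warmup_lr(22350))  # 0.0
```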
### Training results
| Training Loss | Epoch | Step | Validation Loss | PER | PCC | CTC Loss | MSE Loss |
|:-------------:|:-----:|:-----:|:---------------:|:------:|:------:|:--------:|:--------:|
| 43.3938 | 1.0 | 745 | 18.4290 | 0.9890 | 0.1462 | 6.2276 | 12.2287 |
| 9.2178 | 2.0 | 1490 | 4.8023 | 0.9890 | 0.4671 | 3.8571 | 0.9719 |
| 4.753 | 3.0 | 2235 | 4.7789 | 0.9890 | 0.5587 | 3.7983 | 1.0460 |
| 4.5085 | 4.01 | 2980 | 4.4859 | 0.9890 | 0.5896 | 3.6917 | 0.8937 |
| 4.2649 | 5.01 | 3725 | 4.2852 | 0.9890 | 0.6067 | 3.6166 | 0.7981 |
| 4.0688 | 6.01 | 4470 | 4.2684 | 0.9632 | 0.6001 | 3.5490 | 0.8725 |
| 3.9012 | 7.01 | 5215 | 4.3288 | 0.9628 | 0.5973 | 3.5020 | 0.9987 |
| 3.6391 | 8.01 | 5960 | 4.2077 | 0.9430 | 0.5765 | 3.1575 | 1.2098 |
| 3.0582 | 9.01 | 6705 | 3.4214 | 0.7155 | 0.5656 | 2.3835 | 1.1480 |
| 2.3307 | 10.01 | 7450 | 2.4659 | 0.4524 | 0.5672 | 1.6378 | 0.8917 |
| 1.7509 | 11.01 | 8195 | 2.3288 | 0.3189 | 0.5824 | 1.1749 | 1.1589 |
| 1.3679 | 12.02 | 8940 | 2.0410 | 0.2541 | 0.5669 | 0.9275 | 1.0986 |
| 1.1463 | 13.02 | 9685 | 1.8387 | 0.2238 | 0.5714 | 0.8029 | 1.0158 |
| 1.013 | 14.02 | 10430 | 1.6558 | 0.2068 | 0.5635 | 0.7253 | 0.9141 |
| 0.912 | 15.02 | 11175 | 1.6381 | 0.1992 | 0.5530 | 0.6788 | 0.9381 |
| 0.8321 | 16.02 | 11920 | 1.8336 | 0.1908 | 0.5660 | 0.6407 | 1.1459 |
| 0.7676 | 17.02 | 12665 | 1.6002 | 0.1819 | 0.5851 | 0.6197 | 0.9574 |
| 0.7186 | 18.02 | 13410 | 1.6110 | 0.1807 | 0.5562 | 0.5981 | 0.9870 |
| 0.6707 | 19.03 | 14155 | 1.6138 | 0.1748 | 0.5735 | 0.5827 | 1.0041 |
| 0.6362 | 20.03 | 14900 | 1.5090 | 0.1729 | 0.5706 | 0.5688 | 0.9264 |
| 0.6016 | 21.03 | 15645 | 1.5540 | 0.1698 | 0.5742 | 0.5619 | 0.9732 |
| 0.5724 | 22.03 | 16390 | 1.5076 | 0.1686 | 0.5846 | 0.5517 | 0.9435 |
| 0.5434 | 23.03 | 17135 | 1.4442 | 0.1676 | 0.5753 | 0.5443 | 0.8970 |
| 0.5272 | 24.03 | 17880 | 1.4617 | 0.1656 | 0.5699 | 0.5409 | 0.9165 |
| 0.5119 | 25.03 | 18625 | 1.4886 | 0.1642 | 0.5654 | 0.5400 | 0.9414 |
| 0.4963 | 26.03 | 19370 | 1.4959 | 0.1644 | 0.5751 | 0.5342 | 0.9534 |
| 0.4882 | 27.04 | 20115 | 1.4686 | 0.1634 | 0.5711 | 0.5320 | 0.9329 |
| 0.4697 | 28.04 | 20860 | 1.4663 | 0.1627 | 0.5730 | 0.5302 | 0.9330 |
| 0.4604 | 29.04 | 21605 | 1.4417 | 0.1623 | 0.5782 | 0.5293 | 0.9134 |
| 0.4536 | 30.04 | 22350 | 1.4632 | 0.1621 | 0.5775 | 0.5292 | 0.9318 |
### Framework versions
- Transformers 4.38.1
- Pytorch 2.0.1
- Datasets 2.16.1
- Tokenizers 0.15.2
## Model tree for excalibur12/k2e-20s_asr-scr_w2v2-base_002

Base model: facebook/wav2vec2-base