---
license: apache-2.0
base_model: openai/whisper-tiny.en
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: whisper-tiny-finetune
  results: []
---
# whisper-tiny-finetune

This model is a fine-tuned version of [openai/whisper-tiny.en](https://huggingface.co/openai/whisper-tiny.en) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.5792
- Wer: 20.6820
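
Once the checkpoint is pushed to the Hub or saved locally, it can be loaded with the `transformers` ASR pipeline. A minimal inference sketch, assuming a hypothetical repo id `whisper-tiny-finetune` (substitute the actual Hub path or a local directory):

```python
from transformers import pipeline

# Load the fine-tuned checkpoint; "whisper-tiny-finetune" is a placeholder
# for the actual Hub repo id or a local directory with the saved model.
asr = pipeline("automatic-speech-recognition", model="whisper-tiny-finetune")

# Transcribe an audio file; the pipeline decodes and resamples the input to
# the 16 kHz rate Whisper's feature extractor expects.
result = asr("sample.wav")
print(result["text"])
```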
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 128
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- training_steps: 1000
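
These values map onto `Seq2SeqTrainingArguments` roughly as sketched below. This is a reconstruction, not the original training script: the output directory is a placeholder, and `train_batch_size: 128` is the effective batch size, which may have been reached through gradient accumulation or multiple devices rather than a single per-device value.

```python
from transformers import Seq2SeqTrainingArguments

# Reconstructed from the hyperparameter list above (assumed arrangement).
training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-tiny-finetune",  # placeholder output path
    learning_rate=1e-5,
    per_device_train_batch_size=128,  # or e.g. 16 with gradient_accumulation_steps=8
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",              # Adam-style optimizer; the betas/epsilon above are its defaults
    lr_scheduler_type="linear",
    warmup_steps=500,
    max_steps=1000,
)
```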
### Training results

| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:-------:|
4.1356 | 0.2778 | 10 | 4.1201 | 47.9002 |
4.0312 | 0.5556 | 20 | 4.0231 | 47.3319 |
3.917 | 0.8333 | 30 | 3.8659 | 46.5425 |
3.7606 | 1.1111 | 40 | 3.6569 | 45.8478 |
3.4823 | 1.3889 | 50 | 3.3969 | 44.5216 |
3.0938 | 1.6667 | 60 | 3.0765 | 41.9324 |
2.7895 | 1.9444 | 70 | 2.6692 | 34.9542 |
2.3101 | 2.2222 | 80 | 2.1389 | 34.5122 |
1.6935 | 2.5 | 90 | 1.5546 | 34.8911 |
1.1419 | 2.7778 | 100 | 1.0650 | 36.5330 |
0.904 | 3.0556 | 110 | 0.8400 | 29.4601 |
0.7536 | 3.3333 | 120 | 0.7657 | 28.9233 |
0.6857 | 3.6111 | 130 | 0.7202 | 27.7550 |
0.6609 | 3.8889 | 140 | 0.6886 | 26.6814 |
0.5804 | 4.1667 | 150 | 0.6656 | 25.6710 |
0.5611 | 4.4444 | 160 | 0.6465 | 25.0710 |
0.5574 | 4.7222 | 170 | 0.6293 | 24.3448 |
0.552 | 5.0 | 180 | 0.6135 | 24.0606 |
0.4717 | 5.2778 | 190 | 0.6024 | 24.5974 |
0.4681 | 5.5556 | 200 | 0.5898 | 24.0290 |
0.4679 | 5.8333 | 210 | 0.5778 | 23.5238 |
0.4351 | 6.1111 | 220 | 0.5670 | 23.6501 |
0.3982 | 6.3889 | 230 | 0.5599 | 23.2081 |
0.3892 | 6.6667 | 240 | 0.5520 | 22.0714 |
0.3771 | 6.9444 | 250 | 0.5439 | 21.1872 |
0.3532 | 7.2222 | 260 | 0.5372 | 21.6925 |
0.3435 | 7.5 | 270 | 0.5309 | 27.5024 |
0.336 | 7.7778 | 280 | 0.5253 | 20.9346 |
0.3088 | 8.0556 | 290 | 0.5201 | 20.4610 |
0.3014 | 8.3333 | 300 | 0.5184 | 20.5242 |
0.316 | 8.6111 | 310 | 0.5146 | 20.2400 |
0.2931 | 8.8889 | 320 | 0.5118 | 19.9874 |
0.2228 | 9.1667 | 330 | 0.5079 | 20.3663 |
0.2445 | 9.4444 | 340 | 0.5052 | 20.2716 |
0.2343 | 9.7222 | 350 | 0.5039 | 20.2084 |
0.2893 | 10.0 | 360 | 0.5023 | 20.0189 |
0.2014 | 10.2778 | 370 | 0.5030 | 20.0505 |
0.2048 | 10.5556 | 380 | 0.5036 | 19.6400 |
0.1941 | 10.8333 | 390 | 0.5003 | 20.1137 |
0.1601 | 11.1111 | 400 | 0.4992 | 19.8295 |
0.1647 | 11.3889 | 410 | 0.5010 | 19.8926 |
0.1519 | 11.6667 | 420 | 0.5044 | 19.6716 |
0.1747 | 11.9444 | 430 | 0.5005 | 20.1137 |
0.1194 | 12.2222 | 440 | 0.5076 | 20.7452 |
0.1021 | 12.5 | 450 | 0.5104 | 19.9242 |
0.1115 | 12.7778 | 460 | 0.5102 | 20.7136 |
0.1355 | 13.0556 | 470 | 0.5068 | 20.3979 |
0.0824 | 13.3333 | 480 | 0.5152 | 20.5557 |
0.0858 | 13.6111 | 490 | 0.5189 | 20.3663 |
0.0786 | 13.8889 | 500 | 0.5225 | 21.1557 |
0.0564 | 14.1667 | 510 | 0.5250 | 20.9031 |
0.056 | 14.4444 | 520 | 0.5232 | 20.8715 |
0.0558 | 14.7222 | 530 | 0.5282 | 20.5557 |
0.0657 | 15.0 | 540 | 0.5299 | 20.7452 |
0.0369 | 15.2778 | 550 | 0.5342 | 20.6505 |
0.0355 | 15.5556 | 560 | 0.5341 | 20.1137 |
0.0383 | 15.8333 | 570 | 0.5370 | 20.4926 |
0.0333 | 16.1111 | 580 | 0.5401 | 20.5557 |
0.027 | 16.3889 | 590 | 0.5455 | 20.9346 |
0.0261 | 16.6667 | 600 | 0.5480 | 20.6189 |
0.024 | 16.9444 | 610 | 0.5494 | 20.4294 |
0.0164 | 17.2222 | 620 | 0.5505 | 20.3663 |
0.0159 | 17.5 | 630 | 0.5577 | 20.7136 |
0.0168 | 17.7778 | 640 | 0.5549 | 20.9031 |
0.015 | 18.0556 | 650 | 0.5555 | 20.8083 |
0.0116 | 18.3333 | 660 | 0.5596 | 20.9978 |
0.0131 | 18.6111 | 670 | 0.5614 | 20.9346 |
0.0121 | 18.8889 | 680 | 0.5634 | 20.3663 |
0.009 | 19.1667 | 690 | 0.5643 | 20.7452 |
0.0108 | 19.4444 | 700 | 0.5633 | 20.3031 |
0.0096 | 19.7222 | 710 | 0.5666 | 20.3979 |
0.0123 | 20.0 | 720 | 0.5660 | 20.4610 |
0.009 | 20.2778 | 730 | 0.5695 | 20.5242 |
0.0099 | 20.5556 | 740 | 0.5684 | 20.3663 |
0.0079 | 20.8333 | 750 | 0.5701 | 20.7768 |
0.008 | 21.1111 | 760 | 0.5701 | 20.7136 |
0.0084 | 21.3889 | 770 | 0.5719 | 20.7136 |
0.0076 | 21.6667 | 780 | 0.5724 | 20.4610 |
0.0081 | 21.9444 | 790 | 0.5724 | 20.7136 |
0.0067 | 22.2222 | 800 | 0.5731 | 20.6820 |
0.0076 | 22.5 | 810 | 0.5737 | 20.4926 |
0.0079 | 22.7778 | 820 | 0.5748 | 20.3979 |
0.0069 | 23.0556 | 830 | 0.5747 | 20.6820 |
0.0066 | 23.3333 | 840 | 0.5751 | 20.7136 |
0.0062 | 23.6111 | 850 | 0.5755 | 20.7136 |
0.0071 | 23.8889 | 860 | 0.5764 | 20.5873 |
0.0062 | 24.1667 | 870 | 0.5774 | 20.7136 |
0.0059 | 24.4444 | 880 | 0.5769 | 20.5873 |
0.0066 | 24.7222 | 890 | 0.5772 | 20.6189 |
0.0066 | 25.0 | 900 | 0.5778 | 20.5873 |
0.0066 | 25.2778 | 910 | 0.5779 | 20.5557 |
0.0062 | 25.5556 | 920 | 0.5781 | 20.5873 |
0.006 | 25.8333 | 930 | 0.5787 | 20.6189 |
0.0061 | 26.1111 | 940 | 0.5789 | 20.5873 |
0.0056 | 26.3889 | 950 | 0.5788 | 20.5557 |
0.006 | 26.6667 | 960 | 0.5789 | 20.5873 |
0.0055 | 26.9444 | 970 | 0.5790 | 20.5873 |
0.0057 | 27.2222 | 980 | 0.5791 | 20.6189 |
0.0063 | 27.5 | 990 | 0.5792 | 20.6820 |
0.0059 | 27.7778 | 1000 | 0.5792 | 20.6820 |
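
The Wer column is word error rate in percent. The exact evaluation script is not part of this card, but a score in this form is typically computed with the `evaluate` library, as in this sketch with made-up transcripts:

```python
import evaluate

wer_metric = evaluate.load("wer")

# Hypothetical predictions/references; in practice these come from decoding
# the evaluation set with the fine-tuned model.
predictions = ["the cat sat on the mat"]
references = ["the cat sat on a mat"]

# compute() returns a fraction; multiply by 100 to match the table's scale.
wer = 100 * wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}")
```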
### Framework versions
- Transformers 4.40.1
- Pytorch 2.2.1+cu121
- Datasets 2.19.1.dev0
- Tokenizers 0.19.1
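
A quick sanity check that a local environment matches these pins (note that Datasets 2.19.1.dev0 is a development build, so an exact match may require installing `datasets` from source):

```python
import transformers, torch, datasets, tokenizers

# Print installed versions to compare against the pins listed above.
for mod in (transformers, torch, datasets, tokenizers):
    print(mod.__name__, mod.__version__)
```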