ft_1

This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on an unspecified dataset. It achieves the following results on the evaluation set (a minimal inference sketch follows below):

  • Loss: 0.0074
  • Cer: 0.0011
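
The card does not include a usage example, so the following is only a sketch of how a CTC checkpoint like this one is typically loaded for transcription with transformers. It assumes the weights are published under the repository id molto/ft_1 together with processor/tokenizer files, and that the model expects 16 kHz mono audio like the base wav2vec2-xls-r-300m; sample.wav is a placeholder path.

```python
import torch
import librosa
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC

# Assumed repository id; adjust if the checkpoint is hosted elsewhere.
model_id = "molto/ft_1"

processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# wav2vec2-xls-r models expect 16 kHz mono input.
speech, _ = librosa.load("sample.wav", sr=16_000, mono=True)

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding of the most likely token ids.
pred_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(pred_ids)[0]
print(transcription)
```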

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch of the equivalent TrainingArguments follows the list):

  • learning_rate: 0.0001
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 1000
  • num_epochs: 20
  • mixed_precision_training: Native AMP
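
For reference, the hyperparameters above map roughly onto the following transformers.TrainingArguments. This is a reconstruction, not the original training script; output_dir and every option not listed above are assumptions.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="ft_1",                 # assumed; not stated in the card
    learning_rate=1e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=1000,
    num_train_epochs=20,
    fp16=True,                         # "Native AMP" mixed-precision training
)
```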

Training results

Training Loss Epoch Step Validation Loss Cer
9.6728 0.08 500 0.9218 1.0
0.6369 0.17 1000 0.6685 1.0
0.511 0.25 1500 0.4867 0.5464
0.4632 0.34 2000 0.4821 0.2281
0.4354 0.42 2500 0.3887 0.1763
0.3664 0.51 3000 0.2324 0.0484
0.2641 0.59 3500 0.1729 0.0276
0.218 0.67 4000 0.1485 0.0213
0.1938 0.76 4500 0.1270 0.0188
0.1867 0.84 5000 0.1192 0.0177
0.1756 0.93 5500 0.1047 0.0163
0.1659 1.01 6000 0.1008 0.0150
0.1522 1.1 6500 0.0970 0.0140
0.1456 1.18 7000 0.0944 0.0140
0.1431 1.26 7500 0.0905 0.0134
0.1416 1.35 8000 0.0919 0.0120
0.1356 1.43 8500 0.0792 0.0124
0.1345 1.52 9000 0.0880 0.0129
0.1316 1.6 9500 0.0753 0.0111
0.1293 1.69 10000 0.0728 0.0106
0.1238 1.77 10500 0.0682 0.0109
0.121 1.85 11000 0.0645 0.0100
0.1161 1.94 11500 0.0658 0.0094
0.1107 2.02 12000 0.0669 0.0115
0.1054 2.11 12500 0.0597 0.0093
0.104 2.19 13000 0.0572 0.0088
0.1007 2.28 13500 0.0522 0.0082
0.0986 2.36 14000 0.0575 0.0090
0.103 2.44 14500 0.0513 0.0085
0.0973 2.53 15000 0.0597 0.0081
0.0949 2.61 15500 0.0474 0.0074
0.092 2.7 16000 0.0538 0.0083
0.0944 2.78 16500 0.0471 0.0078
0.0896 2.87 17000 0.0417 0.0064
0.0895 2.95 17500 0.0406 0.0063
0.0866 3.03 18000 0.0393 0.0064
0.0795 3.12 18500 0.0377 0.0067
0.0803 3.2 19000 0.0380 0.0063
0.0771 3.29 19500 0.0386 0.0058
0.0742 3.37 20000 0.0347 0.0057
0.0743 3.46 20500 0.0363 0.0062
0.073 3.54 21000 0.0314 0.0051
0.0727 3.62 21500 0.0293 0.0050
0.0693 3.71 22000 0.0331 0.0057
0.0689 3.79 22500 0.0306 0.0050
0.0712 3.88 23000 0.0304 0.0049
0.0697 3.96 23500 0.0287 0.0050
0.0656 4.05 24000 0.0269 0.0046
0.0624 4.13 24500 0.0282 0.0048
0.0614 4.21 25000 0.0262 0.0046
0.0589 4.3 25500 0.0245 0.0044
0.0596 4.38 26000 0.0237 0.0043
0.0589 4.47 26500 0.0233 0.0045
0.059 4.55 27000 0.0275 0.0044
0.0564 4.64 27500 0.0241 0.0042
0.0589 4.72 28000 0.0209 0.0044
0.055 4.8 28500 0.0199 0.0039
0.0629 4.89 29000 0.0223 0.0046
0.0581 4.97 29500 0.0234 0.0044
0.0529 5.06 30000 0.0197 0.0037
0.0489 5.14 30500 0.0232 0.0044
0.0512 5.23 31000 0.0202 0.0039
0.0501 5.31 31500 0.0182 0.0036
0.0494 5.39 32000 0.0181 0.0034
0.049 5.48 32500 0.0173 0.0034
0.0482 5.56 33000 0.0173 0.0034
0.0505 5.65 33500 0.0172 0.0033
0.048 5.73 34000 0.0168 0.0034
0.0485 5.82 34500 0.0162 0.0032
0.0461 5.9 35000 0.0154 0.0032
0.0436 5.98 35500 0.0152 0.0033
0.043 6.07 36000 0.0157 0.0033
0.0429 6.15 36500 0.0160 0.0036
0.0402 6.24 37000 0.0169 0.0034
0.043 6.32 37500 0.0135 0.0030
0.0409 6.41 38000 0.0167 0.0033
0.0446 6.49 38500 0.0155 0.0035
0.0402 6.57 39000 0.0145 0.0029
0.0427 6.66 39500 0.0139 0.0028
0.0392 6.74 40000 0.0148 0.0028
0.0395 6.83 40500 0.0144 0.0026
0.0396 6.91 41000 0.0208 0.0035
0.0394 7.0 41500 0.0127 0.0026
0.0367 7.08 42000 0.0129 0.0023
0.0368 7.16 42500 0.0120 0.0024
0.0345 7.25 43000 0.0119 0.0027
0.0342 7.33 43500 0.0123 0.0026
0.0368 7.42 44000 0.0109 0.0024
0.0358 7.5 44500 0.0120 0.0026
0.036 7.59 45000 0.0110 0.0026
0.0332 7.67 45500 0.0115 0.0024
0.0356 7.75 46000 0.0128 0.0024
0.0353 7.84 46500 0.0123 0.0024
0.0365 7.92 47000 0.0106 0.0023
0.0341 8.01 47500 0.0110 0.0023
0.0286 8.09 48000 0.0109 0.0023
0.0324 8.18 48500 0.0099 0.0022
0.0321 8.26 49000 0.0111 0.0023
0.0333 8.34 49500 0.0110 0.0023
0.0302 8.43 50000 0.0116 0.0026
0.0325 8.51 50500 0.0123 0.0025
0.0317 8.6 51000 0.0101 0.0027
0.032 8.68 51500 0.0108 0.0024
0.0298 8.77 52000 0.0105 0.0024
0.0304 8.85 52500 0.0113 0.0022
0.0315 8.93 53000 0.0109 0.0023
0.0299 9.02 53500 0.0102 0.0022
0.029 9.1 54000 0.0108 0.0021
0.0261 9.19 54500 0.0112 0.0022
0.0286 9.27 55000 0.0116 0.0023
0.0293 9.36 55500 0.0108 0.0023
0.0267 9.44 56000 0.0109 0.0023
0.0271 9.52 56500 0.0103 0.0021
0.0276 9.61 57000 0.0102 0.0021
0.0256 9.69 57500 0.0101 0.0021
0.0264 9.78 58000 0.0131 0.0021
0.0277 9.86 58500 0.0096 0.0021
0.0303 9.95 59000 0.0097 0.0020
0.027 10.03 59500 0.0102 0.0021
0.0241 10.11 60000 0.0089 0.0020
0.0226 10.2 60500 0.0108 0.0020
0.0248 10.28 61000 0.0109 0.0021
0.0244 10.37 61500 0.0093 0.0020
0.025 10.45 62000 0.0167 0.0021
0.0238 10.54 62500 0.0123 0.0019
0.0253 10.62 63000 0.0089 0.0019
0.0247 10.7 63500 0.0090 0.0019
0.0229 10.79 64000 0.0084 0.0017
0.0242 10.87 64500 0.0086 0.0019
0.0251 10.96 65000 0.0082 0.0017
0.0222 11.04 65500 0.0081 0.0017
0.0233 11.13 66000 0.0078 0.0018
0.0226 11.21 66500 0.0081 0.0017
0.0227 11.29 67000 0.0096 0.0018
0.024 11.38 67500 0.0081 0.0017
0.0217 11.46 68000 0.0088 0.0019
0.0222 11.55 68500 0.0086 0.0019
0.0229 11.63 69000 0.0091 0.0021
0.02 11.72 69500 0.0094 0.0020
0.022 11.8 70000 0.0095 0.0017
0.0232 11.88 70500 0.0078 0.0018
0.0216 11.97 71000 0.0088 0.0017
0.0216 12.05 71500 0.0081 0.0017
0.0182 12.14 72000 0.0088 0.0018
0.0196 12.22 72500 0.0092 0.0017
0.0203 12.31 73000 0.0083 0.0018
0.0191 12.39 73500 0.0086 0.0018
0.0192 12.47 74000 0.0122 0.0016
0.0194 12.56 74500 0.0084 0.0016
0.0192 12.64 75000 0.0098 0.0016
0.0192 12.73 75500 0.0101 0.0017
0.0191 12.81 76000 0.0100 0.0016
0.0191 12.9 76500 0.0091 0.0016
0.0184 12.98 77000 0.0084 0.0017
0.0188 13.06 77500 0.0080 0.0016
0.0188 13.15 78000 0.0094 0.0017
0.0172 13.23 78500 0.0098 0.0017
0.0174 13.32 79000 0.0134 0.0015
0.0177 13.4 79500 0.0106 0.0016
0.0178 13.49 80000 0.0100 0.0014
0.017 13.57 80500 0.0104 0.0016
0.0169 13.65 81000 0.0094 0.0014
0.0189 13.74 81500 0.0089 0.0015
0.0172 13.82 82000 0.0086 0.0016
0.0167 13.91 82500 0.0091 0.0015
0.0179 13.99 83000 0.0088 0.0015
0.0175 14.08 83500 0.0076 0.0014
0.0164 14.16 84000 0.0082 0.0013
0.0143 14.24 84500 0.0080 0.0015
0.0158 14.33 85000 0.0082 0.0014
0.0153 14.41 85500 0.0086 0.0016
0.0173 14.5 86000 0.0077 0.0015
0.0149 14.58 86500 0.0084 0.0016
0.0153 14.67 87000 0.0078 0.0015
0.0157 14.75 87500 0.0074 0.0014
0.0173 14.83 88000 0.0086 0.0015
0.0171 14.92 88500 0.0080 0.0014
0.0148 15.0 89000 0.0073 0.0013
0.0145 15.09 89500 0.0074 0.0014
0.0153 15.17 90000 0.0068 0.0014
0.0156 15.26 90500 0.0070 0.0014
0.0143 15.34 91000 0.0072 0.0014
0.0143 15.42 91500 0.0069 0.0015
0.0139 15.51 92000 0.0070 0.0014
0.0149 15.59 92500 0.0087 0.0015
0.0124 15.68 93000 0.0074 0.0013
0.0139 15.76 93500 0.0076 0.0013
0.0148 15.85 94000 0.0074 0.0014
0.013 15.93 94500 0.0073 0.0014
0.0138 16.01 95000 0.0069 0.0013
0.0148 16.1 95500 0.0069 0.0013
0.0135 16.18 96000 0.0067 0.0014
0.0143 16.27 96500 0.0067 0.0012
0.014 16.35 97000 0.0071 0.0013
0.0138 16.44 97500 0.0072 0.0012
0.0132 16.52 98000 0.0070 0.0012
0.0122 16.6 98500 0.0077 0.0013
0.0124 16.69 99000 0.0078 0.0013
0.0138 16.77 99500 0.0070 0.0012
0.0118 16.86 100000 0.0071 0.0013
0.0123 16.94 100500 0.0068 0.0013
0.0113 17.03 101000 0.0070 0.0013
0.011 17.11 101500 0.0076 0.0013
0.0117 17.19 102000 0.0073 0.0013
0.0123 17.28 102500 0.0078 0.0012
0.0132 17.36 103000 0.0072 0.0013
0.0105 17.45 103500 0.0078 0.0012
0.0115 17.53 104000 0.0078 0.0012
0.0112 17.62 104500 0.0076 0.0012
0.0108 17.7 105000 0.0080 0.0012
0.0106 17.78 105500 0.0080 0.0012
0.0114 17.87 106000 0.0074 0.0012
0.0127 17.95 106500 0.0070 0.0012
0.0105 18.04 107000 0.0073 0.0012
0.0108 18.12 107500 0.0083 0.0012
0.0107 18.21 108000 0.0088 0.0012
0.0102 18.29 108500 0.0080 0.0011
0.0089 18.37 109000 0.0073 0.0011
0.0107 18.46 109500 0.0073 0.0011
0.0112 18.54 110000 0.0078 0.0011
0.0106 18.63 110500 0.0077 0.0011
0.0101 18.71 111000 0.0078 0.0011
0.0111 18.8 111500 0.0080 0.0011
0.0106 18.88 112000 0.0073 0.0011
0.0102 18.96 112500 0.0075 0.0011
0.0098 19.05 113000 0.0075 0.0011
0.0118 19.13 113500 0.0078 0.0011
0.0094 19.22 114000 0.0077 0.0011
0.009 19.3 114500 0.0077 0.0011
0.0094 19.39 115000 0.0077 0.0011
0.0084 19.47 115500 0.0074 0.0010
0.011 19.55 116000 0.0076 0.0011
0.0106 19.64 116500 0.0073 0.0011
0.009 19.72 117000 0.0074 0.0011
0.0097 19.81 117500 0.0074 0.0011
0.0095 19.89 118000 0.0074 0.0011
0.0094 19.98 118500 0.0074 0.0011
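
Cer in the table is the character error rate: the character-level edit distance between predictions and references, normalized by reference length. The sketch below shows one way such a value can be computed with the evaluate library (which wraps jiwer); it is not the evaluation code used for this card, and the strings are placeholders.

```python
import evaluate

cer_metric = evaluate.load("cer")  # requires the jiwer package

predictions = ["hello world"]      # model transcriptions (placeholder)
references = ["hello word"]        # ground-truth transcripts (placeholder)

print(cer_metric.compute(predictions=predictions, references=references))
```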

Framework versions

  • Transformers 4.35.2
  • Pytorch 2.1.1+cu121
  • Datasets 2.13.0
  • Tokenizers 0.15.0