molto committed
Commit ce354ee
1 parent: 21e6417

Training in progress, step 1000
.gitignore ADDED
@@ -0,0 +1 @@
+ checkpoint-*/
README.md ADDED
@@ -0,0 +1,295 @@
+ ---
+ license: apache-2.0
+ tags:
+ - generated_from_trainer
+ model-index:
+ - name: ft_1
+   results: []
+ ---
+
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+
+ # ft_1
+
+ This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on the None dataset.
+ It achieves the following results on the evaluation set:
+ - Loss: 0.0186
+ - Cer: 0.0033
+
+ ## Model description
+
+ More information needed
+
+ ## Intended uses & limitations
+
+ More information needed
+
+ ## Training and evaluation data
+
+ More information needed
+
+ ## Training procedure
+
+ ### Training hyperparameters
+
+ The following hyperparameters were used during training:
+ - learning_rate: 0.0001
+ - train_batch_size: 8
+ - eval_batch_size: 8
+ - seed: 42
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
+ - lr_scheduler_type: linear
+ - lr_scheduler_warmup_steps: 1000
+ - num_epochs: 20
+ - mixed_precision_training: Native AMP
+
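The `linear` scheduler with 1,000 warmup steps ramps the learning rate from 0 up to 1e-4, then decays it linearly toward 0 over the remaining steps. A minimal sketch of that schedule (the ~118,500 total steps are an approximation read off the results table; the Trainer derives the exact count from the dataset size):

```python
def linear_warmup_lr(step, base_lr=1e-4, warmup_steps=1000, total_steps=118500):
    """Learning rate under transformers' `linear` schedule:
    linear warmup to base_lr, then linear decay to 0 at total_steps."""
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    # after warmup: decay linearly so the LR reaches 0 at total_steps
    remaining = max(0.0, total_steps - step)
    return base_lr * remaining / (total_steps - warmup_steps)
```

Halfway through warmup the LR is 5e-5; it peaks at 1e-4 exactly when warmup ends, which is why the loss drops fastest shortly after step 1000 in the table.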
+ ### Training results
+
+ | Training Loss | Epoch | Step | Validation Loss | Cer |
+ |:-------------:|:-----:|:------:|:---------------:|:------:|
+ | 5.7681 | 0.08 | 500 | 3.3424 | 1.0 |
+ | 3.0265 | 0.17 | 1000 | 2.4985 | 0.8032 |
+ | 1.28 | 0.25 | 1500 | 0.4905 | 0.1353 |
+ | 0.7128 | 0.34 | 2000 | 0.3396 | 0.1005 |
+ | 0.5924 | 0.42 | 2500 | 0.2566 | 0.0791 |
+ | 0.5196 | 0.51 | 3000 | 0.2411 | 0.0746 |
+ | 0.4867 | 0.59 | 3500 | 0.2115 | 0.0652 |
+ | 0.4509 | 0.67 | 4000 | 0.1884 | 0.0582 |
+ | 0.4284 | 0.76 | 4500 | 0.2251 | 0.0699 |
+ | 0.4025 | 0.84 | 5000 | 0.1846 | 0.0550 |
+ | 0.3863 | 0.93 | 5500 | 0.1412 | 0.0449 |
+ | 0.3687 | 1.01 | 6000 | 0.1389 | 0.0444 |
+ | 0.3568 | 1.1 | 6500 | 0.1246 | 0.0397 |
+ | 0.3445 | 1.18 | 7000 | 0.1206 | 0.0395 |
+ | 0.3317 | 1.26 | 7500 | 0.1464 | 0.0433 |
+ | 0.3194 | 1.35 | 8000 | 0.1152 | 0.0364 |
+ | 0.3165 | 1.43 | 8500 | 0.1105 | 0.0343 |
+ | 0.3096 | 1.52 | 9000 | 0.0994 | 0.0318 |
+ | 0.3008 | 1.6 | 9500 | 0.0939 | 0.0302 |
+ | 0.2933 | 1.69 | 10000 | 0.0920 | 0.0291 |
+ | 0.291 | 1.77 | 10500 | 0.0851 | 0.0280 |
+ | 0.2817 | 1.85 | 11000 | 0.0819 | 0.0265 |
+ | 0.2694 | 1.94 | 11500 | 0.1000 | 0.0289 |
+ | 0.273 | 2.02 | 12000 | 0.0940 | 0.0275 |
+ | 0.2521 | 2.11 | 12500 | 0.0742 | 0.0243 |
+ | 0.249 | 2.19 | 13000 | 0.0693 | 0.0230 |
+ | 0.2418 | 2.28 | 13500 | 0.0692 | 0.0224 |
+ | 0.2505 | 2.36 | 14000 | 0.0686 | 0.0219 |
+ | 0.2424 | 2.44 | 14500 | 0.0666 | 0.0215 |
+ | 0.2338 | 2.53 | 15000 | 0.0646 | 0.0211 |
+ | 0.2396 | 2.61 | 15500 | 0.0623 | 0.0203 |
+ | 0.2252 | 2.7 | 16000 | 0.0581 | 0.0189 |
+ | 0.227 | 2.78 | 16500 | 0.0546 | 0.0177 |
+ | 0.2206 | 2.87 | 17000 | 0.0557 | 0.0184 |
+ | 0.2239 | 2.95 | 17500 | 0.0610 | 0.0174 |
+ | 0.2085 | 3.03 | 18000 | 0.0518 | 0.0167 |
+ | 0.2043 | 3.12 | 18500 | 0.0554 | 0.0175 |
+ | 0.1947 | 3.2 | 19000 | 0.0525 | 0.0162 |
+ | 0.2002 | 3.29 | 19500 | 0.0508 | 0.0168 |
+ | 0.2036 | 3.37 | 20000 | 0.0483 | 0.0157 |
+ | 0.1974 | 3.46 | 20500 | 0.0515 | 0.0162 |
+ | 0.1911 | 3.54 | 21000 | 0.0490 | 0.0147 |
+ | 0.19 | 3.62 | 21500 | 0.0476 | 0.0152 |
+ | 0.1895 | 3.71 | 22000 | 0.0673 | 0.0181 |
+ | 0.1858 | 3.79 | 22500 | 0.0508 | 0.0150 |
+ | 0.1814 | 3.88 | 23000 | 0.0823 | 0.0199 |
+ | 0.1795 | 3.96 | 23500 | 0.0451 | 0.0135 |
+ | 0.1805 | 4.05 | 24000 | 0.0421 | 0.0137 |
+ | 0.1659 | 4.13 | 24500 | 0.0451 | 0.0132 |
+ | 0.1744 | 4.21 | 25000 | 0.0382 | 0.0120 |
+ | 0.1721 | 4.3 | 25500 | 0.0345 | 0.0118 |
+ | 0.1663 | 4.38 | 26000 | 0.0329 | 0.0106 |
+ | 0.1692 | 4.47 | 26500 | 0.0359 | 0.0114 |
+ | 0.1654 | 4.55 | 27000 | 0.0337 | 0.0110 |
+ | 0.1545 | 4.64 | 27500 | 0.0341 | 0.0108 |
+ | 0.1602 | 4.72 | 28000 | 0.0337 | 0.0113 |
+ | 0.1589 | 4.8 | 28500 | 0.0321 | 0.0100 |
+ | 0.1646 | 4.89 | 29000 | 0.0334 | 0.0101 |
+ | 0.1532 | 4.97 | 29500 | 0.0554 | 0.0130 |
+ | 0.152 | 5.06 | 30000 | 0.0725 | 0.0149 |
+ | 0.149 | 5.14 | 30500 | 0.2240 | 0.0299 |
+ | 0.1425 | 5.23 | 31000 | 0.0802 | 0.0153 |
+ | 0.1454 | 5.31 | 31500 | 0.1454 | 0.0236 |
+ | 0.1486 | 5.39 | 32000 | 0.0300 | 0.0087 |
+ | 0.1466 | 5.48 | 32500 | 0.0367 | 0.0095 |
+ | 0.1424 | 5.56 | 33000 | 0.0290 | 0.0086 |
+ | 0.1388 | 5.65 | 33500 | 0.0472 | 0.0110 |
+ | 0.1416 | 5.73 | 34000 | 0.0339 | 0.0086 |
+ | 0.14 | 5.82 | 34500 | 0.0238 | 0.0076 |
+ | 0.139 | 5.9 | 35000 | 0.0255 | 0.0080 |
+ | 0.1425 | 5.98 | 35500 | 0.0272 | 0.0074 |
+ | 0.1318 | 6.07 | 36000 | 0.0252 | 0.0075 |
+ | 0.1294 | 6.15 | 36500 | 0.0242 | 0.0076 |
+ | 0.1317 | 6.24 | 37000 | 0.0255 | 0.0070 |
+ | 0.1299 | 6.32 | 37500 | 0.0230 | 0.0068 |
+ | 0.1323 | 6.41 | 38000 | 0.0246 | 0.0070 |
+ | 0.1278 | 6.49 | 38500 | 0.0252 | 0.0069 |
+ | 0.1248 | 6.57 | 39000 | 0.0278 | 0.0074 |
+ | 0.1246 | 6.66 | 39500 | 0.0229 | 0.0065 |
+ | 0.1253 | 6.74 | 40000 | 0.0205 | 0.0062 |
+ | 0.1283 | 6.83 | 40500 | 0.0212 | 0.0059 |
+ | 0.1239 | 6.91 | 41000 | 0.0200 | 0.0060 |
+ | 0.1227 | 7.0 | 41500 | 0.0200 | 0.0059 |
+ | 0.1085 | 7.08 | 42000 | 0.0188 | 0.0056 |
+ | 0.12 | 7.16 | 42500 | 0.0203 | 0.0058 |
+ | 0.1165 | 7.25 | 43000 | 0.0245 | 0.0064 |
+ | 0.1085 | 7.33 | 43500 | 0.0203 | 0.0056 |
+ | 0.1124 | 7.42 | 44000 | 0.0185 | 0.0054 |
+ | 0.111 | 7.5 | 44500 | 0.0187 | 0.0054 |
+ | 0.11 | 7.59 | 45000 | 0.0191 | 0.0053 |
+ | 0.1126 | 7.67 | 45500 | 0.0190 | 0.0055 |
+ | 0.1112 | 7.75 | 46000 | 0.0181 | 0.0053 |
+ | 0.1163 | 7.84 | 46500 | 0.0216 | 0.0056 |
+ | 0.1081 | 7.92 | 47000 | 0.0225 | 0.0053 |
+ | 0.111 | 8.01 | 47500 | 0.0176 | 0.0048 |
+ | 0.1035 | 8.09 | 48000 | 0.0191 | 0.0049 |
+ | 0.1046 | 8.18 | 48500 | 0.0172 | 0.0049 |
+ | 0.1015 | 8.26 | 49000 | 0.0427 | 0.0080 |
+ | 0.1048 | 8.34 | 49500 | 0.0215 | 0.0049 |
+ | 0.108 | 8.43 | 50000 | 0.0257 | 0.0060 |
+ | 0.0983 | 8.51 | 50500 | 0.0188 | 0.0049 |
+ | 0.0983 | 8.6 | 51000 | 0.0243 | 0.0056 |
+ | 0.1006 | 8.68 | 51500 | 0.0167 | 0.0046 |
+ | 0.0999 | 8.77 | 52000 | 0.0144 | 0.0043 |
+ | 0.0963 | 8.85 | 52500 | 0.0150 | 0.0044 |
+ | 0.0981 | 8.93 | 53000 | 0.0145 | 0.0042 |
+ | 0.1001 | 9.02 | 53500 | 0.0142 | 0.0042 |
+ | 0.0894 | 9.1 | 54000 | 0.0153 | 0.0043 |
+ | 0.0952 | 9.19 | 54500 | 0.0151 | 0.0042 |
+ | 0.0904 | 9.27 | 55000 | 0.0149 | 0.0041 |
+ | 0.092 | 9.36 | 55500 | 0.0147 | 0.0041 |
+ | 0.0886 | 9.44 | 56000 | 0.0151 | 0.0041 |
+ | 0.0936 | 9.52 | 56500 | 0.0224 | 0.0038 |
+ | 0.0883 | 9.61 | 57000 | 0.0164 | 0.0041 |
+ | 0.0944 | 9.69 | 57500 | 0.0196 | 0.0042 |
+ | 0.0877 | 9.78 | 58000 | 0.0173 | 0.0040 |
+ | 0.0924 | 9.86 | 58500 | 0.0152 | 0.0041 |
+ | 0.0903 | 9.95 | 59000 | 0.0156 | 0.0039 |
+ | 0.0908 | 10.03 | 59500 | 0.0139 | 0.0037 |
+ | 0.087 | 10.11 | 60000 | 0.0134 | 0.0036 |
+ | 0.0832 | 10.2 | 60500 | 0.0136 | 0.0037 |
+ | 0.0842 | 10.28 | 61000 | 0.0129 | 0.0034 |
+ | 0.0839 | 10.37 | 61500 | 0.0137 | 0.0037 |
+ | 0.084 | 10.45 | 62000 | 0.0138 | 0.0036 |
+ | 0.0822 | 10.54 | 62500 | 0.0133 | 0.0035 |
+ | 0.0791 | 10.62 | 63000 | 0.0134 | 0.0036 |
+ | 0.088 | 10.7 | 63500 | 0.0172 | 0.0036 |
+ | 0.082 | 10.79 | 64000 | 0.0145 | 0.0036 |
+ | 0.0807 | 10.87 | 64500 | 0.0156 | 0.0036 |
+ | 0.0789 | 10.96 | 65000 | 0.0169 | 0.0036 |
+ | 0.0788 | 11.04 | 65500 | 0.0175 | 0.0035 |
+ | 0.0744 | 11.13 | 66000 | 0.0248 | 0.0034 |
+ | 0.0751 | 11.21 | 66500 | 0.0155 | 0.0035 |
+ | 0.0742 | 11.29 | 67000 | 0.0146 | 0.0034 |
+ | 0.0723 | 11.38 | 67500 | 0.0126 | 0.0032 |
+ | 0.0761 | 11.46 | 68000 | 0.0174 | 0.0034 |
+ | 0.0754 | 11.55 | 68500 | 0.0145 | 0.0032 |
+ | 0.0755 | 11.63 | 69000 | 0.0130 | 0.0032 |
+ | 0.068 | 11.72 | 69500 | 0.0142 | 0.0035 |
+ | 0.0731 | 11.8 | 70000 | 0.0171 | 0.0039 |
+ | 0.0742 | 11.88 | 70500 | 0.0180 | 0.0038 |
+ | 0.0704 | 11.97 | 71000 | 0.0172 | 0.0034 |
+ | 0.0677 | 12.05 | 71500 | 0.0127 | 0.0032 |
+ | 0.0673 | 12.14 | 72000 | 0.0186 | 0.0036 |
+ | 0.0701 | 12.22 | 72500 | 0.0157 | 0.0031 |
+ | 0.0711 | 12.31 | 73000 | 0.0166 | 0.0031 |
+ | 0.0652 | 12.39 | 73500 | 0.0174 | 0.0030 |
+ | 0.0649 | 12.47 | 74000 | 0.0129 | 0.0030 |
+ | 0.069 | 12.56 | 74500 | 0.0139 | 0.0032 |
+ | 0.0662 | 12.64 | 75000 | 0.0120 | 0.0029 |
+ | 0.0683 | 12.73 | 75500 | 0.0120 | 0.0032 |
+ | 0.0669 | 12.81 | 76000 | 0.0119 | 0.0028 |
+ | 0.0682 | 12.9 | 76500 | 0.0116 | 0.0028 |
+ | 0.0663 | 12.98 | 77000 | 0.0107 | 0.0028 |
+ | 0.0633 | 13.06 | 77500 | 0.0113 | 0.0029 |
+ | 0.0617 | 13.15 | 78000 | 0.0111 | 0.0028 |
+ | 0.0616 | 13.23 | 78500 | 0.0105 | 0.0028 |
+ | 0.0641 | 13.32 | 79000 | 0.0113 | 0.0028 |
+ | 0.0585 | 13.4 | 79500 | 0.0140 | 0.0032 |
+ | 0.06 | 13.49 | 80000 | 0.0114 | 0.0027 |
+ | 0.0605 | 13.57 | 80500 | 0.0124 | 0.0029 |
+ | 0.0591 | 13.65 | 81000 | 0.0117 | 0.0027 |
+ | 0.0598 | 13.74 | 81500 | 0.0125 | 0.0028 |
+ | 0.0584 | 13.82 | 82000 | 0.0130 | 0.0028 |
+ | 0.0583 | 13.91 | 82500 | 0.0166 | 0.0028 |
+ | 0.0636 | 13.99 | 83000 | 0.0128 | 0.0028 |
+ | 0.0551 | 14.08 | 83500 | 0.0113 | 0.0028 |
+ | 0.0569 | 14.16 | 84000 | 0.0116 | 0.0027 |
+ | 0.0584 | 14.24 | 84500 | 0.0152 | 0.0026 |
+ | 0.0569 | 14.33 | 85000 | 0.0113 | 0.0026 |
+ | 0.057 | 14.41 | 85500 | 0.0119 | 0.0027 |
+ | 0.059 | 14.5 | 86000 | 0.0112 | 0.0026 |
+ | 0.0575 | 14.58 | 86500 | 0.0113 | 0.0027 |
+ | 0.0557 | 14.67 | 87000 | 0.0140 | 0.0028 |
+ | 0.0541 | 14.75 | 87500 | 0.0116 | 0.0027 |
+ | 0.0522 | 14.83 | 88000 | 0.0108 | 0.0027 |
+ | 0.051 | 14.92 | 88500 | 0.0110 | 0.0026 |
+ | 0.0511 | 15.0 | 89000 | 0.0103 | 0.0026 |
+ | 0.0527 | 15.09 | 89500 | 0.0110 | 0.0026 |
+ | 0.0518 | 15.17 | 90000 | 0.0114 | 0.0026 |
+ | 0.0506 | 15.26 | 90500 | 0.0131 | 0.0029 |
+ | 0.0515 | 15.34 | 91000 | 0.0116 | 0.0026 |
+ | 0.0524 | 15.42 | 91500 | 0.0116 | 0.0027 |
+ | 0.0502 | 15.51 | 92000 | 0.0116 | 0.0027 |
+ | 0.0496 | 15.59 | 92500 | 0.0108 | 0.0026 |
+ | 0.0507 | 15.68 | 93000 | 0.0108 | 0.0025 |
+ | 0.049 | 15.76 | 93500 | 0.0102 | 0.0024 |
+ | 0.0526 | 15.85 | 94000 | 0.0107 | 0.0026 |
+ | 0.0483 | 15.93 | 94500 | 0.0098 | 0.0024 |
+ | 0.047 | 16.01 | 95000 | 0.0100 | 0.0023 |
+ | 0.0494 | 16.1 | 95500 | 0.0102 | 0.0024 |
+ | 0.0488 | 16.18 | 96000 | 0.0106 | 0.0025 |
+ | 0.0438 | 16.27 | 96500 | 0.0098 | 0.0023 |
+ | 0.047 | 16.35 | 97000 | 0.0100 | 0.0024 |
+ | 0.0477 | 16.44 | 97500 | 0.0096 | 0.0023 |
+ | 0.0465 | 16.52 | 98000 | 0.0114 | 0.0025 |
+ | 0.0468 | 16.6 | 98500 | 0.0098 | 0.0023 |
+ | 0.0465 | 16.69 | 99000 | 0.0111 | 0.0026 |
+ | 0.0455 | 16.77 | 99500 | 0.0098 | 0.0023 |
+ | 0.0461 | 16.86 | 100000 | 0.0111 | 0.0026 |
+ | 0.0453 | 16.94 | 100500 | 0.0130 | 0.0030 |
+ | 0.042 | 17.03 | 101000 | 0.0216 | 0.0041 |
+ | 0.0426 | 17.11 | 101500 | 0.0191 | 0.0036 |
+ | 0.0449 | 17.19 | 102000 | 0.0188 | 0.0036 |
+ | 0.042 | 17.28 | 102500 | 0.0119 | 0.0025 |
+ | 0.0459 | 17.36 | 103000 | 0.0134 | 0.0025 |
+ | 0.0431 | 17.45 | 103500 | 0.0234 | 0.0040 |
+ | 0.0434 | 17.53 | 104000 | 0.0250 | 0.0041 |
+ | 0.0407 | 17.62 | 104500 | 0.0316 | 0.0049 |
+ | 0.0418 | 17.7 | 105000 | 0.0312 | 0.0049 |
+ | 0.0396 | 17.78 | 105500 | 0.0295 | 0.0047 |
+ | 0.0444 | 17.87 | 106000 | 0.0170 | 0.0032 |
+ | 0.0421 | 17.95 | 106500 | 0.0226 | 0.0038 |
+ | 0.0414 | 18.04 | 107000 | 0.0271 | 0.0043 |
+ | 0.0372 | 18.12 | 107500 | 0.0173 | 0.0032 |
+ | 0.0406 | 18.21 | 108000 | 0.0159 | 0.0029 |
+ | 0.0407 | 18.29 | 108500 | 0.0144 | 0.0026 |
+ | 0.0395 | 18.37 | 109000 | 0.0161 | 0.0029 |
+ | 0.0402 | 18.46 | 109500 | 0.0172 | 0.0031 |
+ | 0.0421 | 18.54 | 110000 | 0.0177 | 0.0032 |
+ | 0.038 | 18.63 | 110500 | 0.0282 | 0.0044 |
+ | 0.0408 | 18.71 | 111000 | 0.0212 | 0.0036 |
+ | 0.0363 | 18.8 | 111500 | 0.0173 | 0.0030 |
+ | 0.036 | 18.88 | 112000 | 0.0192 | 0.0033 |
+ | 0.0393 | 18.96 | 112500 | 0.0185 | 0.0032 |
+ | 0.0403 | 19.05 | 113000 | 0.0175 | 0.0031 |
+ | 0.0392 | 19.13 | 113500 | 0.0198 | 0.0034 |
+ | 0.0382 | 19.22 | 114000 | 0.0197 | 0.0034 |
+ | 0.0372 | 19.3 | 114500 | 0.0185 | 0.0032 |
+ | 0.0375 | 19.39 | 115000 | 0.0187 | 0.0033 |
+ | 0.035 | 19.47 | 115500 | 0.0196 | 0.0034 |
+ | 0.0367 | 19.55 | 116000 | 0.0196 | 0.0034 |
+ | 0.0346 | 19.64 | 116500 | 0.0185 | 0.0033 |
+ | 0.037 | 19.72 | 117000 | 0.0190 | 0.0034 |
+ | 0.0372 | 19.81 | 117500 | 0.0187 | 0.0033 |
+ | 0.0365 | 19.89 | 118000 | 0.0188 | 0.0033 |
+ | 0.034 | 19.98 | 118500 | 0.0186 | 0.0033 |
+
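The Cer column above is the character error rate: the character-level Levenshtein edit distance between the reference transcript and the model's hypothesis, divided by the reference length. A self-contained toy re-implementation for illustration (the training run itself would typically use a library metric):

```python
def cer(reference: str, hypothesis: str) -> float:
    """Character error rate = character-level edit distance / reference length."""
    # one-row dynamic-programming Levenshtein distance over characters
    prev = list(range(len(hypothesis) + 1))
    for i, r in enumerate(reference, start=1):
        cur = [i]
        for j, h in enumerate(hypothesis, start=1):
            cur.append(min(prev[j] + 1,               # deletion
                           cur[j - 1] + 1,            # insertion
                           prev[j - 1] + (r != h)))   # substitution or match
        prev = cur
    return prev[-1] / len(reference)
```

At the reported final Cer of 0.0033, the model mistakes roughly one character in 300 on the evaluation set.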
+
+ ### Framework versions
+
+ - Transformers 4.29.2
+ - Pytorch 2.0.1+cu117
+ - Datasets 2.13.0
+ - Tokenizers 0.13.3
config.json CHANGED
@@ -1,6 +1,7 @@
  {
    "_name_or_path": "facebook/wav2vec2-xls-r-300m",
    "activation_dropout": 0.0,
+   "adapter_attn_dim": null,
    "adapter_kernel_size": 3,
    "adapter_stride": 2,
    "add_adapter": false,
@@ -101,8 +102,8 @@
      1
    ],
    "torch_dtype": "float32",
-   "transformers_version": "4.29.2",
+   "transformers_version": "4.35.2",
    "use_weighted_layer_sum": false,
-   "vocab_size": 50,
+   "vocab_size": 52,
    "xvector_output_dim": 512
  }
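The `vocab_size` bump (50 to 52) matters because the CTC head emits one logit per vocabulary entry per audio frame, so the config must agree with the tokenizer saved alongside the checkpoint or the weights will not load. A minimal sanity check over the changed fields (the JSON excerpt below is reconstructed from the diff above, not the full file):

```python
import json

# Fields changed in this commit, excerpted from config.json
updated = json.loads("""
{
  "adapter_attn_dim": null,
  "transformers_version": "4.35.2",
  "vocab_size": 52
}
""")

# The CTC output layer has exactly `vocab_size` output units; this value
# must match the number of symbols in the model's character vocabulary.
assert updated["vocab_size"] == 52
assert updated["adapter_attn_dim"] is None
```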
model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:b08204487fe34271023aa593a87b70ed4251464cead582bf99df2b34da3d4707
+ size 1262020680
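The 1.2 GB weight file is stored via Git LFS, so the repository itself holds only a three-line pointer (version, oid, size). Parsing such a pointer is straightforward; a sketch using the pointer committed above:

```python
def parse_lfs_pointer(text: str) -> dict:
    """Parse a Git LFS pointer file into its version, oid, and size fields."""
    # each pointer line is "<key> <value>"
    fields = dict(line.split(" ", 1) for line in text.strip().splitlines())
    algo, digest = fields["oid"].split(":", 1)
    return {"version": fields["version"],
            "oid_algo": algo,        # e.g. "sha256"
            "oid": digest,
            "size": int(fields["size"])}

pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:b08204487fe34271023aa593a87b70ed4251464cead582bf99df2b34da3d4707
size 1262020680
"""
info = parse_lfs_pointer(pointer)
```

The oid is the SHA-256 of the actual blob, which LFS-aware clients use to fetch and verify the real file.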
runs/Jan04_15-05-44_DESKTOP-CE5K89C/events.out.tfevents.1704350035.DESKTOP-CE5K89C.15664.0 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:a02db029415cfd3101d124f396bc09129e373f686bc5375a72746aeafdc123dc
+ size 6582
runs/Jun30_12-03-47_DESKTOP-CE5K89C/1688095847.1366158/events.out.tfevents.1688095847.DESKTOP-CE5K89C.14868.1 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:de1ef41a83281ac3875b83c2f2433c2ab55205ff027e57e9487df05de45d4c5c
+ size 5836
runs/Jun30_12-03-47_DESKTOP-CE5K89C/events.out.tfevents.1688095847.DESKTOP-CE5K89C.14868.0 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:529d5fe5886634a0b1858a7d101138fffe1395e33c4d2c03d02572609a913cd7
+ size 120142
training_args.bin CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:2f52742c4522553a7cee34064fd0eab9080e2362d9fa6e95b6480b8baf9a0079
- size 3899
+ oid sha256:b09e5ac00bc48c594a1e144fe531d7dd84249cc6daf467a8f2c75d2326b25b24
+ size 4536