Sandipan1994 committed on
Commit
16e12ed
1 Parent(s): 8b587c0

update model card README.md

Files changed (1)
  1. README.md +306 -0
README.md ADDED
---
license: apache-2.0
tags:
- generated_from_trainer
model-index:
- name: t5-small-entailement-Writer-T5-base
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# t5-small-entailement-Writer-T5-base

This model is a fine-tuned version of [t5-small](https://huggingface.co/t5-small) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5697

## Model description

More information needed

## Intended uses & limitations

More information needed
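
The task, prompt format, and training data are not documented here, so usage details are uncertain. As a rough sketch of how the checkpoint could be loaded for text-to-text generation with `transformers` (the repository id below is assumed from the model name and account, and may need adjusting):

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Assumed repository id; replace with the actual path of this checkpoint if different.
model_id = "Sandipan1994/t5-small-entailement-Writer-T5-base"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Placeholder input: the expected prompt format for this fine-tune is not documented.
inputs = tokenizer("input text here", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```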

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 250
- mixed_precision_training: Native AMP
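
As an illustration only, the list above roughly corresponds to the following `Seq2SeqTrainingArguments` in Transformers 4.25; the output directory, the evaluation strategy, and the surrounding `Trainer`/dataset wiring are assumptions, not taken from the original training script:

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="t5-small-entailement-Writer-T5-base",  # placeholder output directory
    learning_rate=2e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08 (the library defaults)
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=250,
    fp16=True,  # native AMP mixed-precision training
    evaluation_strategy="epoch",  # assumed from the per-epoch validation losses below
)
```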

### Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:-----:|:---------------:|
| No log | 1.0 | 42 | 1.8185 |
| No log | 2.0 | 84 | 1.1957 |
| No log | 3.0 | 126 | 0.9771 |
| No log | 4.0 | 168 | 0.8964 |
| No log | 5.0 | 210 | 0.8380 |
| No log | 6.0 | 252 | 0.8109 |
| No log | 7.0 | 294 | 0.7886 |
| No log | 8.0 | 336 | 0.7760 |
| No log | 9.0 | 378 | 0.7577 |
| No log | 10.0 | 420 | 0.7483 |
| No log | 11.0 | 462 | 0.7364 |
| 1.2044 | 12.0 | 504 | 0.7267 |
| 1.2044 | 13.0 | 546 | 0.7205 |
| 1.2044 | 14.0 | 588 | 0.7102 |
| 1.2044 | 15.0 | 630 | 0.7048 |
| 1.2044 | 16.0 | 672 | 0.7015 |
| 1.2044 | 17.0 | 714 | 0.6958 |
| 1.2044 | 18.0 | 756 | 0.6892 |
| 1.2044 | 19.0 | 798 | 0.6877 |
| 1.2044 | 20.0 | 840 | 0.6825 |
| 1.2044 | 21.0 | 882 | 0.6790 |
| 1.2044 | 22.0 | 924 | 0.6732 |
| 1.2044 | 23.0 | 966 | 0.6676 |
| 0.736 | 24.0 | 1008 | 0.6640 |
| 0.736 | 25.0 | 1050 | 0.6631 |
| 0.736 | 26.0 | 1092 | 0.6617 |
| 0.736 | 27.0 | 1134 | 0.6556 |
| 0.736 | 28.0 | 1176 | 0.6551 |
| 0.736 | 29.0 | 1218 | 0.6545 |
| 0.736 | 30.0 | 1260 | 0.6483 |
| 0.736 | 31.0 | 1302 | 0.6493 |
| 0.736 | 32.0 | 1344 | 0.6488 |
| 0.736 | 33.0 | 1386 | 0.6434 |
| 0.736 | 34.0 | 1428 | 0.6427 |
| 0.736 | 35.0 | 1470 | 0.6403 |
| 0.6568 | 36.0 | 1512 | 0.6364 |
| 0.6568 | 37.0 | 1554 | 0.6342 |
| 0.6568 | 38.0 | 1596 | 0.6325 |
| 0.6568 | 39.0 | 1638 | 0.6300 |
| 0.6568 | 40.0 | 1680 | 0.6302 |
| 0.6568 | 41.0 | 1722 | 0.6292 |
| 0.6568 | 42.0 | 1764 | 0.6264 |
| 0.6568 | 43.0 | 1806 | 0.6272 |
| 0.6568 | 44.0 | 1848 | 0.6252 |
| 0.6568 | 45.0 | 1890 | 0.6229 |
| 0.6568 | 46.0 | 1932 | 0.6221 |
| 0.6568 | 47.0 | 1974 | 0.6202 |
| 0.602 | 48.0 | 2016 | 0.6193 |
| 0.602 | 49.0 | 2058 | 0.6196 |
| 0.602 | 50.0 | 2100 | 0.6174 |
| 0.602 | 51.0 | 2142 | 0.6175 |
| 0.602 | 52.0 | 2184 | 0.6162 |
| 0.602 | 53.0 | 2226 | 0.6155 |
| 0.602 | 54.0 | 2268 | 0.6129 |
| 0.602 | 55.0 | 2310 | 0.6139 |
| 0.602 | 56.0 | 2352 | 0.6124 |
| 0.602 | 57.0 | 2394 | 0.6128 |
| 0.602 | 58.0 | 2436 | 0.6109 |
| 0.602 | 59.0 | 2478 | 0.6111 |
| 0.5653 | 60.0 | 2520 | 0.6097 |
| 0.5653 | 61.0 | 2562 | 0.6086 |
| 0.5653 | 62.0 | 2604 | 0.6083 |
| 0.5653 | 63.0 | 2646 | 0.6086 |
| 0.5653 | 64.0 | 2688 | 0.6090 |
| 0.5653 | 65.0 | 2730 | 0.6074 |
| 0.5653 | 66.0 | 2772 | 0.6064 |
| 0.5653 | 67.0 | 2814 | 0.6056 |
| 0.5653 | 68.0 | 2856 | 0.6039 |
| 0.5653 | 69.0 | 2898 | 0.6051 |
| 0.5653 | 70.0 | 2940 | 0.6043 |
| 0.5653 | 71.0 | 2982 | 0.6034 |
| 0.5368 | 72.0 | 3024 | 0.6020 |
| 0.5368 | 73.0 | 3066 | 0.6047 |
| 0.5368 | 74.0 | 3108 | 0.6031 |
| 0.5368 | 75.0 | 3150 | 0.6011 |
| 0.5368 | 76.0 | 3192 | 0.6027 |
| 0.5368 | 77.0 | 3234 | 0.6009 |
| 0.5368 | 78.0 | 3276 | 0.6003 |
| 0.5368 | 79.0 | 3318 | 0.6001 |
| 0.5368 | 80.0 | 3360 | 0.6008 |
| 0.5368 | 81.0 | 3402 | 0.6005 |
| 0.5368 | 82.0 | 3444 | 0.6007 |
| 0.5368 | 83.0 | 3486 | 0.5988 |
| 0.5055 | 84.0 | 3528 | 0.5991 |
| 0.5055 | 85.0 | 3570 | 0.6004 |
| 0.5055 | 86.0 | 3612 | 0.5989 |
| 0.5055 | 87.0 | 3654 | 0.5975 |
| 0.5055 | 88.0 | 3696 | 0.5977 |
| 0.5055 | 89.0 | 3738 | 0.5982 |
| 0.5055 | 90.0 | 3780 | 0.5964 |
| 0.5055 | 91.0 | 3822 | 0.5979 |
| 0.5055 | 92.0 | 3864 | 0.5996 |
| 0.5055 | 93.0 | 3906 | 0.5936 |
| 0.5055 | 94.0 | 3948 | 0.5956 |
| 0.5055 | 95.0 | 3990 | 0.5940 |
| 0.4866 | 96.0 | 4032 | 0.5961 |
| 0.4866 | 97.0 | 4074 | 0.5955 |
| 0.4866 | 98.0 | 4116 | 0.5949 |
| 0.4866 | 99.0 | 4158 | 0.5971 |
| 0.4866 | 100.0 | 4200 | 0.5958 |
| 0.4866 | 101.0 | 4242 | 0.5978 |
| 0.4866 | 102.0 | 4284 | 0.5971 |
| 0.4866 | 103.0 | 4326 | 0.5954 |
| 0.4866 | 104.0 | 4368 | 0.5933 |
| 0.4866 | 105.0 | 4410 | 0.5944 |
| 0.4866 | 106.0 | 4452 | 0.5952 |
| 0.4866 | 107.0 | 4494 | 0.5948 |
| 0.4657 | 108.0 | 4536 | 0.5951 |
| 0.4657 | 109.0 | 4578 | 0.5948 |
| 0.4657 | 110.0 | 4620 | 0.5948 |
| 0.4657 | 111.0 | 4662 | 0.5927 |
| 0.4657 | 112.0 | 4704 | 0.5931 |
| 0.4657 | 113.0 | 4746 | 0.5919 |
| 0.4657 | 114.0 | 4788 | 0.5939 |
| 0.4657 | 115.0 | 4830 | 0.5922 |
| 0.4657 | 116.0 | 4872 | 0.5921 |
| 0.4657 | 117.0 | 4914 | 0.5917 |
| 0.4657 | 118.0 | 4956 | 0.5913 |
| 0.4657 | 119.0 | 4998 | 0.5908 |
| 0.4468 | 120.0 | 5040 | 0.5929 |
| 0.4468 | 121.0 | 5082 | 0.5915 |
| 0.4468 | 122.0 | 5124 | 0.5926 |
| 0.4468 | 123.0 | 5166 | 0.5929 |
| 0.4468 | 124.0 | 5208 | 0.5911 |
| 0.4468 | 125.0 | 5250 | 0.5907 |
| 0.4468 | 126.0 | 5292 | 0.5921 |
| 0.4468 | 127.0 | 5334 | 0.5917 |
| 0.4468 | 128.0 | 5376 | 0.5923 |
| 0.4468 | 129.0 | 5418 | 0.5912 |
| 0.4468 | 130.0 | 5460 | 0.5930 |
| 0.4346 | 131.0 | 5502 | 0.5924 |
| 0.4346 | 132.0 | 5544 | 0.5933 |
| 0.4346 | 133.0 | 5586 | 0.5920 |
| 0.4346 | 134.0 | 5628 | 0.5937 |
| 0.4346 | 135.0 | 5670 | 0.5930 |
| 0.4346 | 136.0 | 5712 | 0.5930 |
| 0.4346 | 137.0 | 5754 | 0.5929 |
| 0.4346 | 138.0 | 5796 | 0.5916 |
| 0.4346 | 139.0 | 5838 | 0.5935 |
| 0.4346 | 140.0 | 5880 | 0.5947 |
| 0.4346 | 141.0 | 5922 | 0.5926 |
| 0.4346 | 142.0 | 5964 | 0.5930 |
| 0.4247 | 143.0 | 6006 | 0.5911 |
| 0.4247 | 144.0 | 6048 | 0.5916 |
| 0.4247 | 145.0 | 6090 | 0.5929 |
| 0.4247 | 146.0 | 6132 | 0.5926 |
| 0.4247 | 147.0 | 6174 | 0.5917 |
| 0.4247 | 148.0 | 6216 | 0.5913 |
| 0.4247 | 149.0 | 6258 | 0.5907 |
| 0.4247 | 150.0 | 6300 | 0.5930 |
| 0.4247 | 151.0 | 6342 | 0.5928 |
| 0.4247 | 152.0 | 6384 | 0.5922 |
| 0.4247 | 153.0 | 6426 | 0.5921 |
| 0.4247 | 154.0 | 6468 | 0.5925 |
| 0.4139 | 155.0 | 6510 | 0.5923 |
| 0.4139 | 156.0 | 6552 | 0.5919 |
| 0.4139 | 157.0 | 6594 | 0.5920 |
| 0.4139 | 158.0 | 6636 | 0.5935 |
| 0.4139 | 159.0 | 6678 | 0.5926 |
| 0.4139 | 160.0 | 6720 | 0.5926 |
| 0.4139 | 161.0 | 6762 | 0.5925 |
| 0.4139 | 162.0 | 6804 | 0.5927 |
| 0.4139 | 163.0 | 6846 | 0.5918 |
| 0.4139 | 164.0 | 6888 | 0.5925 |
| 0.4139 | 165.0 | 6930 | 0.5935 |
| 0.4139 | 166.0 | 6972 | 0.5926 |
| 0.4049 | 167.0 | 7014 | 0.5919 |
| 0.4049 | 168.0 | 7056 | 0.5917 |
| 0.4049 | 169.0 | 7098 | 0.5916 |
| 0.4049 | 170.0 | 7140 | 0.5925 |
| 0.4049 | 171.0 | 7182 | 0.5931 |
| 0.4049 | 172.0 | 7224 | 0.5938 |
| 0.4049 | 173.0 | 7266 | 0.5932 |
| 0.4049 | 174.0 | 7308 | 0.5927 |
| 0.4049 | 175.0 | 7350 | 0.5934 |
| 0.4049 | 176.0 | 7392 | 0.5931 |
| 0.4049 | 177.0 | 7434 | 0.5937 |
| 0.4049 | 178.0 | 7476 | 0.5939 |
| 0.397 | 179.0 | 7518 | 0.5939 |
| 0.397 | 180.0 | 7560 | 0.5932 |
| 0.397 | 181.0 | 7602 | 0.5935 |
| 0.397 | 182.0 | 7644 | 0.5939 |
| 0.397 | 183.0 | 7686 | 0.5935 |
| 0.397 | 184.0 | 7728 | 0.5945 |
| 0.397 | 185.0 | 7770 | 0.5932 |
| 0.397 | 186.0 | 7812 | 0.5931 |
| 0.397 | 187.0 | 7854 | 0.5925 |
| 0.397 | 188.0 | 7896 | 0.5934 |
| 0.397 | 189.0 | 7938 | 0.5941 |
| 0.397 | 190.0 | 7980 | 0.5939 |
| 0.3891 | 191.0 | 8022 | 0.5933 |
| 0.3891 | 192.0 | 8064 | 0.5934 |
| 0.3891 | 193.0 | 8106 | 0.5938 |
| 0.3891 | 194.0 | 8148 | 0.5944 |
| 0.3891 | 195.0 | 8190 | 0.5937 |
| 0.3891 | 196.0 | 8232 | 0.5939 |
| 0.3891 | 197.0 | 8274 | 0.5937 |
| 0.3891 | 198.0 | 8316 | 0.5947 |
| 0.3891 | 199.0 | 8358 | 0.5945 |
| 0.3891 | 200.0 | 8400 | 0.5946 |
| 0.3891 | 201.0 | 8442 | 0.5945 |
| 0.3891 | 202.0 | 8484 | 0.5938 |
| 0.3842 | 203.0 | 8526 | 0.5947 |
| 0.3842 | 204.0 | 8568 | 0.5945 |
| 0.3842 | 205.0 | 8610 | 0.5935 |
| 0.3842 | 206.0 | 8652 | 0.5935 |
| 0.3842 | 207.0 | 8694 | 0.5939 |
| 0.3842 | 208.0 | 8736 | 0.5938 |
| 0.3842 | 209.0 | 8778 | 0.5939 |
| 0.3842 | 210.0 | 8820 | 0.5940 |
| 0.3842 | 211.0 | 8862 | 0.5943 |
| 0.3842 | 212.0 | 8904 | 0.5943 |
| 0.3842 | 213.0 | 8946 | 0.5946 |
| 0.3842 | 214.0 | 8988 | 0.5946 |
| 0.3802 | 215.0 | 9030 | 0.5947 |
| 0.3802 | 216.0 | 9072 | 0.5949 |
| 0.3802 | 217.0 | 9114 | 0.5944 |
| 0.3802 | 218.0 | 9156 | 0.5946 |
| 0.3802 | 219.0 | 9198 | 0.5950 |
| 0.3802 | 220.0 | 9240 | 0.5950 |
| 0.3802 | 221.0 | 9282 | 0.5953 |
| 0.3802 | 222.0 | 9324 | 0.5951 |
| 0.3802 | 223.0 | 9366 | 0.5956 |
| 0.3802 | 224.0 | 9408 | 0.5952 |
| 0.3802 | 225.0 | 9450 | 0.5955 |
| 0.3802 | 226.0 | 9492 | 0.5958 |
| 0.3791 | 227.0 | 9534 | 0.5954 |
| 0.3791 | 228.0 | 9576 | 0.5953 |
| 0.3791 | 229.0 | 9618 | 0.5959 |
| 0.3791 | 230.0 | 9660 | 0.5959 |
| 0.3791 | 231.0 | 9702 | 0.5957 |
| 0.3791 | 232.0 | 9744 | 0.5957 |
| 0.3791 | 233.0 | 9786 | 0.5956 |
| 0.3791 | 234.0 | 9828 | 0.5956 |
| 0.3791 | 235.0 | 9870 | 0.5956 |
| 0.3791 | 236.0 | 9912 | 0.5956 |
| 0.3791 | 237.0 | 9954 | 0.5957 |
| 0.3791 | 238.0 | 9996 | 0.5960 |
| 0.3764 | 239.0 | 10038 | 0.5956 |
| 0.3764 | 240.0 | 10080 | 0.5956 |
| 0.3764 | 241.0 | 10122 | 0.5955 |
| 0.3764 | 242.0 | 10164 | 0.5956 |
| 0.3764 | 243.0 | 10206 | 0.5955 |
| 0.3764 | 244.0 | 10248 | 0.5957 |
| 0.3764 | 245.0 | 10290 | 0.5956 |
| 0.3764 | 246.0 | 10332 | 0.5955 |
| 0.3764 | 247.0 | 10374 | 0.5954 |
| 0.3764 | 248.0 | 10416 | 0.5955 |
| 0.3764 | 249.0 | 10458 | 0.5954 |
| 0.3763 | 250.0 | 10500 | 0.5954 |

### Framework versions

- Transformers 4.25.1
- Pytorch 1.13.0+cu116
- Datasets 2.7.1
- Tokenizers 0.13.2