1-800-BAD-CODE committed
Commit • a392f9c
1 Parent(s): 5548a75
Update README.md
README.md CHANGED
@@ -217,3 +217,56 @@ Given the amount of work required to collect pretty metrics in 47 languages, I'l
 
 Expand any of the following tabs to see metrics for that language.
 
+
+<details>
+<summary>English</summary>
+
+```text
+punct_post test report:
+    label                        precision    recall    f1        support
+    <NULL> (label_id: 0)             99.18     98.47     98.82     538769
+    <ACRONYM> (label_id: 1)          66.03     78.63     71.78        571
+    . (label_id: 2)                  90.66     93.68     92.14      30581
+    , (label_id: 3)                  74.18     82.93     78.31      23230
+    ? (label_id: 4)                  78.10     80.08     79.07       1024
+    ？ (label_id: 5)                  0.00      0.00      0.00          0
+    ， (label_id: 6)                  0.00      0.00      0.00          0
+    。 (label_id: 7)                  0.00      0.00      0.00          0
+    、 (label_id: 8)                  0.00      0.00      0.00          0
+    ・ (label_id: 9)                  0.00      0.00      0.00          0
+    । (label_id: 10)                 0.00      0.00      0.00          0
+    ؟ (label_id: 11)                 0.00      0.00      0.00          0
+    ، (label_id: 12)                 0.00      0.00      0.00          0
+    ; (label_id: 13)                 0.00      0.00      0.00          0
+    ። (label_id: 14)                 0.00      0.00      0.00          0
+    ፣ (label_id: 15)                 0.00      0.00      0.00          0
+    ፧ (label_id: 16)                 0.00      0.00      0.00          0
+    -------------------
+    micro avg                        97.56     97.56     97.56     594175
+    macro avg                        81.63     86.76     84.03     594175
+    weighted avg                     97.70     97.56     97.62     594175
+```
+
+```text
+cap test report:
+    label                        precision    recall    f1        support
+    LOWER (label_id: 0)              99.71     99.85     99.78    2036824
+    UPPER (label_id: 1)              96.40     93.27     94.81      87747
+    -------------------
+    micro avg                        99.58     99.58     99.58    2124571
+    macro avg                        98.06     96.56     97.30    2124571
+    weighted avg                     99.57     99.58     99.58    2124571
+```
+
+```text
+seg test report:
+    label                        precision    recall    f1        support
+    NOSTOP (label_id: 0)             99.97     99.98     99.98     564228
+    FULLSTOP (label_id: 1)           99.73     99.54     99.64      32947
+    -------------------
+    micro avg                        99.96     99.96     99.96     597175
+    macro avg                        99.85     99.76     99.81     597175
+    weighted avg                     99.96     99.96     99.96     597175
+```
+
+
+</details>
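
For context, the tables above follow the usual per-label precision/recall/F1/support layout. Below is a minimal sketch of how a report in this shape can be produced with scikit-learn's `classification_report`. It is illustrative only: the label names, IDs, and toy arrays are hypothetical, and this is not necessarily the evaluation code that produced the numbers above.

```python
# Illustrative sketch only: producing a per-label precision/recall/F1/support
# table in the same shape as the reports above, using scikit-learn.
# The label set, references, and predictions here are hypothetical toy data.
from sklearn.metrics import classification_report

# Toy token-level targets and predictions, encoded as label IDs
# (0 = <NULL>, 1 = <ACRONYM>, 2 = ".", 3 = ",").
y_true = [0, 0, 2, 0, 3, 0, 0, 2, 1, 0]
y_pred = [0, 0, 2, 0, 0, 0, 0, 2, 1, 3]

label_names = ["<NULL>", "<ACRONYM>", ".", ","]

print(
    classification_report(
        y_true,
        y_pred,
        labels=list(range(len(label_names))),  # keep every label in the table
        target_names=label_names,
        digits=2,
        zero_division=0,  # report 0.00 for unsupported labels instead of warning
    )
)
```

Setting `zero_division=0` keeps labels with no test support (like the zero-support punctuation rows above) in the table as 0.00 rather than raising undefined-metric warnings.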