{
  "nbformat": 4,
  "nbformat_minor": 0,
  "metadata": {
    "colab": {
      "name": "kmb_baseline.ipynb",
      "provenance": [],
      "collapsed_sections": [],
      "toc_visible": true
    },
    "kernelspec": {
      "name": "python3",
      "display_name": "Python 3"
    },
    "accelerator": "GPU"
  },
  "cells": [
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "dK0TQmS_OT_g",
        "colab_type": "text"
      },
      "source": [
        "# English to Kimbundu Baseline (Masakhane)"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "GYLYy2KkOZD3",
        "colab_type": "text"
      },
      "source": [
        "## Dependencies"
      ]
    },
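    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "The cells below install everything the pipeline needs: `fast_align` (built from source, with `libsparsehash` and the perftools libraries as build dependencies) for word alignments, `opustools-pkg` for downloading the JW300 parallel corpus, `joeynmt` for training the translation model, and `fuzzywuzzy` with `python-Levenshtein` for detecting (near-)duplicate sentences."
      ]
    },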
    {
      "cell_type": "code",
      "metadata": {
        "id": "UiGBHYWtOSS9",
        "colab_type": "code",
        "outputId": "6579fb6c-fcb6-47a9-f02f-f075dafc3732",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 555
        }
      },
      "source": [
        "! apt-get install libgoogle-perftools-dev libsparsehash-dev"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "Reading package lists... Done\n",
            "Building dependency tree       \n",
            "Reading state information... Done\n",
            "The following package was automatically installed and is no longer required:\n",
            "  libnvidia-common-430\n",
            "Use 'apt autoremove' to remove it.\n",
            "The following additional packages will be installed:\n",
            "  libunwind-dev\n",
            "The following NEW packages will be installed:\n",
            "  libgoogle-perftools-dev libsparsehash-dev libunwind-dev\n",
            "0 upgraded, 3 newly installed, 0 to remove and 16 not upgraded.\n",
            "Need to get 699 kB of archives.\n",
            "After this operation, 7,374 kB of additional disk space will be used.\n",
            "Get:1 http://archive.ubuntu.com/ubuntu bionic/main amd64 libunwind-dev amd64 1.2.1-8 [423 kB]\n",
            "Get:2 http://archive.ubuntu.com/ubuntu bionic/main amd64 libgoogle-perftools-dev amd64 2.5-2.2ubuntu3 [204 kB]\n",
            "Get:3 http://archive.ubuntu.com/ubuntu bionic/universe amd64 libsparsehash-dev all 2.0.2-1 [72.4 kB]\n",
            "Fetched 699 kB in 1s (750 kB/s)\n",
            "Selecting previously unselected package libunwind-dev:amd64.\n",
            "(Reading database ... 145155 files and directories currently installed.)\n",
            "Preparing to unpack .../libunwind-dev_1.2.1-8_amd64.deb ...\n",
            "Unpacking libunwind-dev:amd64 (1.2.1-8) ...\n",
            "Selecting previously unselected package libgoogle-perftools-dev.\n",
            "Preparing to unpack .../libgoogle-perftools-dev_2.5-2.2ubuntu3_amd64.deb ...\n",
            "Unpacking libgoogle-perftools-dev (2.5-2.2ubuntu3) ...\n",
            "Selecting previously unselected package libsparsehash-dev.\n",
            "Preparing to unpack .../libsparsehash-dev_2.0.2-1_all.deb ...\n",
            "Unpacking libsparsehash-dev (2.0.2-1) ...\n",
            "Setting up libsparsehash-dev (2.0.2-1) ...\n",
            "Setting up libunwind-dev:amd64 (1.2.1-8) ...\n",
            "Setting up libgoogle-perftools-dev (2.5-2.2ubuntu3) ...\n",
            "Processing triggers for man-db (2.8.3-2ubuntu0.1) ...\n"
          ],
          "name": "stdout"
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "GyZDwcUgOlZo",
        "colab_type": "code",
        "outputId": "f5d67e2a-bb87-4afc-a164-6ee8b3d1afa5",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 139
        }
      },
      "source": [
        "! git clone https://github.com/clab/fast_align.git"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "Cloning into 'fast_align'...\n",
            "remote: Enumerating objects: 9, done.\u001b[K\n",
            "remote: Counting objects: 100% (9/9), done.\u001b[K\n",
            "remote: Compressing objects: 100% (7/7), done.\u001b[K\n",
            "remote: Total 213 (delta 2), reused 4 (delta 2), pack-reused 204\u001b[K\n",
            "Receiving objects: 100% (213/213), 70.68 KiB | 3.07 MiB/s, done.\n",
            "Resolving deltas: 100% (110/110), done.\n"
          ],
          "name": "stdout"
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "vrcmfy77Or0Z",
        "colab_type": "code",
        "outputId": "5d4d3ef5-c478-4caa-bc4e-69f23c8fad58",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 503
        }
      },
      "source": [
        "! cd fast_align && mkdir build && cd build && cmake .. && make"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "-- The C compiler identification is GNU 7.4.0\n",
            "-- The CXX compiler identification is GNU 7.4.0\n",
            "-- Check for working C compiler: /usr/bin/cc\n",
            "-- Check for working C compiler: /usr/bin/cc -- works\n",
            "-- Detecting C compiler ABI info\n",
            "-- Detecting C compiler ABI info - done\n",
            "-- Detecting C compile features\n",
            "-- Detecting C compile features - done\n",
            "-- Check for working CXX compiler: /usr/bin/c++\n",
            "-- Check for working CXX compiler: /usr/bin/c++ -- works\n",
            "-- Detecting CXX compiler ABI info\n",
            "-- Detecting CXX compiler ABI info - done\n",
            "-- Detecting CXX compile features\n",
            "-- Detecting CXX compile features - done\n",
            "-- Found SparseHash: /usr/include  \n",
            "-- Configuring done\n",
            "-- Generating done\n",
            "-- Build files have been written to: /content/fast_align/build\n",
            "\u001b[35m\u001b[1mScanning dependencies of target atools\u001b[0m\n",
            "[ 16%] \u001b[32mBuilding CXX object CMakeFiles/atools.dir/src/alignment_io.cc.o\u001b[0m\n",
            "[ 33%] \u001b[32mBuilding CXX object CMakeFiles/atools.dir/src/atools.cc.o\u001b[0m\n",
            "[ 50%] \u001b[32m\u001b[1mLinking CXX executable atools\u001b[0m\n",
            "[ 50%] Built target atools\n",
            "\u001b[35m\u001b[1mScanning dependencies of target fast_align\u001b[0m\n",
            "[ 66%] \u001b[32mBuilding CXX object CMakeFiles/fast_align.dir/src/fast_align.cc.o\u001b[0m\n",
            "[ 83%] \u001b[32mBuilding CXX object CMakeFiles/fast_align.dir/src/ttables.cc.o\u001b[0m\n",
            "[100%] \u001b[32m\u001b[1mLinking CXX executable fast_align\u001b[0m\n",
            "[100%] Built target fast_align\n"
          ],
          "name": "stdout"
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "sJ2LEkCfO4Gx",
        "colab_type": "code",
        "outputId": "4552ff25-9464-4371-f416-f2f4ea60a2a4",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 124
        }
      },
      "source": [
        "! pip install opustools-pkg"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "Collecting opustools-pkg\n",
            "\u001b[?25l  Downloading https://files.pythonhosted.org/packages/6c/9f/e829a0cceccc603450cd18e1ff80807b6237a88d9a8df2c0bb320796e900/opustools_pkg-0.0.52-py3-none-any.whl (80kB)\n",
            "\r\u001b[K     |████                            | 10kB 29.7MB/s eta 0:00:01\r\u001b[K     |████████                        | 20kB 2.2MB/s eta 0:00:01\r\u001b[K     |████████████▏                   | 30kB 3.2MB/s eta 0:00:01\r\u001b[K     |████████████████▏               | 40kB 2.1MB/s eta 0:00:01\r\u001b[K     |████████████████████▎           | 51kB 2.6MB/s eta 0:00:01\r\u001b[K     |████████████████████████▎       | 61kB 3.1MB/s eta 0:00:01\r\u001b[K     |████████████████████████████▎   | 71kB 3.6MB/s eta 0:00:01\r\u001b[K     |████████████████████████████████| 81kB 3.2MB/s \n",
            "\u001b[?25hInstalling collected packages: opustools-pkg\n",
            "Successfully installed opustools-pkg-0.0.52\n"
          ],
          "name": "stdout"
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "_m9P3UU0PAXr",
        "colab_type": "code",
        "outputId": "89b37029-dee2-4ca7-e3c9-c08f45d9994d",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 1000
        }
      },
      "source": [
        "! git clone https://github.com/joeynmt/joeynmt.git\n",
        "! cd joeynmt; pip3 install ."
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "Cloning into 'joeynmt'...\n",
            "remote: Enumerating objects: 84, done.\u001b[K\n",
            "remote: Counting objects: 100% (84/84), done.\u001b[K\n",
            "remote: Compressing objects: 100% (59/59), done.\u001b[K\n",
            "remote: Total 2268 (delta 50), reused 44 (delta 25), pack-reused 2184\u001b[K\n",
            "Receiving objects: 100% (2268/2268), 2.63 MiB | 17.18 MiB/s, done.\n",
            "Resolving deltas: 100% (1571/1571), done.\n",
            "Processing /content/joeynmt\n",
            "Requirement already satisfied: future in /usr/local/lib/python3.6/dist-packages (from joeynmt==0.0.1) (0.16.0)\n",
            "Requirement already satisfied: pillow in /usr/local/lib/python3.6/dist-packages (from joeynmt==0.0.1) (6.2.2)\n",
            "Requirement already satisfied: numpy<2.0,>=1.14.5 in /usr/local/lib/python3.6/dist-packages (from joeynmt==0.0.1) (1.17.5)\n",
            "Requirement already satisfied: setuptools>=41.0.0 in /usr/local/lib/python3.6/dist-packages (from joeynmt==0.0.1) (45.1.0)\n",
            "Requirement already satisfied: torch>=1.1 in /usr/local/lib/python3.6/dist-packages (from joeynmt==0.0.1) (1.4.0)\n",
            "Requirement already satisfied: tensorflow>=1.14 in /usr/local/lib/python3.6/dist-packages (from joeynmt==0.0.1) (1.15.0)\n",
            "Requirement already satisfied: torchtext in /usr/local/lib/python3.6/dist-packages (from joeynmt==0.0.1) (0.3.1)\n",
            "Collecting sacrebleu>=1.3.6\n",
            "  Downloading https://files.pythonhosted.org/packages/45/31/1a135b964c169984b27fb2f7a50280fa7f8e6d9d404d8a9e596180487fd1/sacrebleu-1.4.3-py3-none-any.whl\n",
            "Collecting subword-nmt\n",
            "  Downloading https://files.pythonhosted.org/packages/74/60/6600a7bc09e7ab38bc53a48a20d8cae49b837f93f5842a41fe513a694912/subword_nmt-0.3.7-py2.py3-none-any.whl\n",
            "Requirement already satisfied: matplotlib in /usr/local/lib/python3.6/dist-packages (from joeynmt==0.0.1) (3.1.2)\n",
            "Requirement already satisfied: seaborn in /usr/local/lib/python3.6/dist-packages (from joeynmt==0.0.1) (0.9.1)\n",
            "Collecting pyyaml>=5.1\n",
            "\u001b[?25l  Downloading https://files.pythonhosted.org/packages/3d/d9/ea9816aea31beeadccd03f1f8b625ecf8f645bd66744484d162d84803ce5/PyYAML-5.3.tar.gz (268kB)\n",
            "\u001b[K     |████████████████████████████████| 276kB 8.1MB/s \n",
            "\u001b[?25hCollecting pylint\n",
            "\u001b[?25l  Downloading https://files.pythonhosted.org/packages/e9/59/43fc36c5ee316bb9aeb7cf5329cdbdca89e5749c34d5602753827c0aa2dc/pylint-2.4.4-py3-none-any.whl (302kB)\n",
            "\u001b[K     |████████████████████████████████| 307kB 14.8MB/s \n",
            "\u001b[?25hRequirement already satisfied: six==1.12 in /usr/local/lib/python3.6/dist-packages (from joeynmt==0.0.1) (1.12.0)\n",
            "Requirement already satisfied: keras-preprocessing>=1.0.5 in /usr/local/lib/python3.6/dist-packages (from tensorflow>=1.14->joeynmt==0.0.1) (1.1.0)\n",
            "Requirement already satisfied: gast==0.2.2 in /usr/local/lib/python3.6/dist-packages (from tensorflow>=1.14->joeynmt==0.0.1) (0.2.2)\n",
            "Requirement already satisfied: google-pasta>=0.1.6 in /usr/local/lib/python3.6/dist-packages (from tensorflow>=1.14->joeynmt==0.0.1) (0.1.8)\n",
            "Requirement already satisfied: wrapt>=1.11.1 in /usr/local/lib/python3.6/dist-packages (from tensorflow>=1.14->joeynmt==0.0.1) (1.11.2)\n",
            "Requirement already satisfied: absl-py>=0.7.0 in /usr/local/lib/python3.6/dist-packages (from tensorflow>=1.14->joeynmt==0.0.1) (0.9.0)\n",
            "Requirement already satisfied: keras-applications>=1.0.8 in /usr/local/lib/python3.6/dist-packages (from tensorflow>=1.14->joeynmt==0.0.1) (1.0.8)\n",
            "Requirement already satisfied: wheel>=0.26 in /usr/local/lib/python3.6/dist-packages (from tensorflow>=1.14->joeynmt==0.0.1) (0.34.2)\n",
            "Requirement already satisfied: tensorflow-estimator==1.15.1 in /usr/local/lib/python3.6/dist-packages (from tensorflow>=1.14->joeynmt==0.0.1) (1.15.1)\n",
            "Requirement already satisfied: protobuf>=3.6.1 in /usr/local/lib/python3.6/dist-packages (from tensorflow>=1.14->joeynmt==0.0.1) (3.10.0)\n",
            "Requirement already satisfied: termcolor>=1.1.0 in /usr/local/lib/python3.6/dist-packages (from tensorflow>=1.14->joeynmt==0.0.1) (1.1.0)\n",
            "Requirement already satisfied: opt-einsum>=2.3.2 in /usr/local/lib/python3.6/dist-packages (from tensorflow>=1.14->joeynmt==0.0.1) (3.1.0)\n",
            "Requirement already satisfied: astor>=0.6.0 in /usr/local/lib/python3.6/dist-packages (from tensorflow>=1.14->joeynmt==0.0.1) (0.8.1)\n",
            "Requirement already satisfied: grpcio>=1.8.6 in /usr/local/lib/python3.6/dist-packages (from tensorflow>=1.14->joeynmt==0.0.1) (1.15.0)\n",
            "Requirement already satisfied: tensorboard<1.16.0,>=1.15.0 in /usr/local/lib/python3.6/dist-packages (from tensorflow>=1.14->joeynmt==0.0.1) (1.15.0)\n",
            "Requirement already satisfied: requests in /usr/local/lib/python3.6/dist-packages (from torchtext->joeynmt==0.0.1) (2.21.0)\n",
            "Requirement already satisfied: tqdm in /usr/local/lib/python3.6/dist-packages (from torchtext->joeynmt==0.0.1) (4.28.1)\n",
            "Requirement already satisfied: typing in /usr/local/lib/python3.6/dist-packages (from sacrebleu>=1.3.6->joeynmt==0.0.1) (3.6.6)\n",
            "Collecting portalocker\n",
            "  Downloading https://files.pythonhosted.org/packages/91/db/7bc703c0760df726839e0699b7f78a4d8217fdc9c7fcb1b51b39c5a22a4e/portalocker-1.5.2-py2.py3-none-any.whl\n",
            "Requirement already satisfied: python-dateutil>=2.1 in /usr/local/lib/python3.6/dist-packages (from matplotlib->joeynmt==0.0.1) (2.6.1)\n",
            "Requirement already satisfied: kiwisolver>=1.0.1 in /usr/local/lib/python3.6/dist-packages (from matplotlib->joeynmt==0.0.1) (1.1.0)\n",
            "Requirement already satisfied: cycler>=0.10 in /usr/local/lib/python3.6/dist-packages (from matplotlib->joeynmt==0.0.1) (0.10.0)\n",
            "Requirement already satisfied: pyparsing!=2.0.4,!=2.1.2,!=2.1.6,>=2.0.1 in /usr/local/lib/python3.6/dist-packages (from matplotlib->joeynmt==0.0.1) (2.4.6)\n",
            "Requirement already satisfied: scipy>=0.17.1 in /usr/local/lib/python3.6/dist-packages (from seaborn->joeynmt==0.0.1) (1.4.1)\n",
            "Requirement already satisfied: pandas>=0.17.1 in /usr/local/lib/python3.6/dist-packages (from seaborn->joeynmt==0.0.1) (0.25.3)\n",
            "Collecting mccabe<0.7,>=0.6\n",
            "  Downloading https://files.pythonhosted.org/packages/87/89/479dc97e18549e21354893e4ee4ef36db1d237534982482c3681ee6e7b57/mccabe-0.6.1-py2.py3-none-any.whl\n",
            "Collecting astroid<2.4,>=2.3.0\n",
            "\u001b[?25l  Downloading https://files.pythonhosted.org/packages/ad/ae/86734823047962e7b8c8529186a1ac4a7ca19aaf1aa0c7713c022ef593fd/astroid-2.3.3-py3-none-any.whl (205kB)\n",
            "\u001b[K     |████████████████████████████████| 215kB 18.1MB/s \n",
            "\u001b[?25hCollecting isort<5,>=4.2.5\n",
            "\u001b[?25l  Downloading https://files.pythonhosted.org/packages/e5/b0/c121fd1fa3419ea9bfd55c7f9c4fedfec5143208d8c7ad3ce3db6c623c21/isort-4.3.21-py2.py3-none-any.whl (42kB)\n",
            "\u001b[K     |████████████████████████████████| 51kB 7.4MB/s \n",
            "\u001b[?25hRequirement already satisfied: h5py in /usr/local/lib/python3.6/dist-packages (from keras-applications>=1.0.8->tensorflow>=1.14->joeynmt==0.0.1) (2.8.0)\n",
            "Requirement already satisfied: werkzeug>=0.11.15 in /usr/local/lib/python3.6/dist-packages (from tensorboard<1.16.0,>=1.15.0->tensorflow>=1.14->joeynmt==0.0.1) (0.16.1)\n",
            "Requirement already satisfied: markdown>=2.6.8 in /usr/local/lib/python3.6/dist-packages (from tensorboard<1.16.0,>=1.15.0->tensorflow>=1.14->joeynmt==0.0.1) (3.1.1)\n",
            "Requirement already satisfied: urllib3<1.25,>=1.21.1 in /usr/local/lib/python3.6/dist-packages (from requests->torchtext->joeynmt==0.0.1) (1.24.3)\n",
            "Requirement already satisfied: chardet<3.1.0,>=3.0.2 in /usr/local/lib/python3.6/dist-packages (from requests->torchtext->joeynmt==0.0.1) (3.0.4)\n",
            "Requirement already satisfied: idna<2.9,>=2.5 in /usr/local/lib/python3.6/dist-packages (from requests->torchtext->joeynmt==0.0.1) (2.8)\n",
            "Requirement already satisfied: certifi>=2017.4.17 in /usr/local/lib/python3.6/dist-packages (from requests->torchtext->joeynmt==0.0.1) (2019.11.28)\n",
            "Requirement already satisfied: pytz>=2017.2 in /usr/local/lib/python3.6/dist-packages (from pandas>=0.17.1->seaborn->joeynmt==0.0.1) (2018.9)\n",
            "Collecting typed-ast<1.5,>=1.4.0; implementation_name == \"cpython\" and python_version < \"3.8\"\n",
            "\u001b[?25l  Downloading https://files.pythonhosted.org/packages/90/ed/5459080d95eb87a02fe860d447197be63b6e2b5e9ff73c2b0a85622994f4/typed_ast-1.4.1-cp36-cp36m-manylinux1_x86_64.whl (737kB)\n",
            "\u001b[K     |████████████████████████████████| 747kB 19.9MB/s \n",
            "\u001b[?25hCollecting lazy-object-proxy==1.4.*\n",
            "\u001b[?25l  Downloading https://files.pythonhosted.org/packages/0b/dd/b1e3407e9e6913cf178e506cd0dee818e58694d9a5cd1984e3f6a8b9a10f/lazy_object_proxy-1.4.3-cp36-cp36m-manylinux1_x86_64.whl (55kB)\n",
            "\u001b[K     |████████████████████████████████| 61kB 9.0MB/s \n",
            "\u001b[?25hBuilding wheels for collected packages: joeynmt, pyyaml\n",
            "  Building wheel for joeynmt (setup.py) ... \u001b[?25l\u001b[?25hdone\n",
            "  Created wheel for joeynmt: filename=joeynmt-0.0.1-cp36-none-any.whl size=73017 sha256=21d2b5093d74cba0354895c618fae30fc41d9a1a2415f6889ab20ac1f2f0bad6\n",
            "  Stored in directory: /tmp/pip-ephem-wheel-cache-zyrii7fm/wheels/db/01/db/751cc9f3e7f6faec127c43644ba250a3ea7ad200594aeda70a\n",
            "  Building wheel for pyyaml (setup.py) ... \u001b[?25l\u001b[?25hdone\n",
            "  Created wheel for pyyaml: filename=PyYAML-5.3-cp36-cp36m-linux_x86_64.whl size=44229 sha256=47a30aa84821b32517dc90994753e491af88e658d419f1b672ae53ccae2d7af2\n",
            "  Stored in directory: /root/.cache/pip/wheels/e4/76/4d/a95b8dd7b452b69e8ed4f68b69e1b55e12c9c9624dd962b191\n",
            "Successfully built joeynmt pyyaml\n",
            "Installing collected packages: portalocker, sacrebleu, subword-nmt, pyyaml, mccabe, typed-ast, lazy-object-proxy, astroid, isort, pylint, joeynmt\n",
            "  Found existing installation: PyYAML 3.13\n",
            "    Uninstalling PyYAML-3.13:\n",
            "      Successfully uninstalled PyYAML-3.13\n",
            "Successfully installed astroid-2.3.3 isort-4.3.21 joeynmt-0.0.1 lazy-object-proxy-1.4.3 mccabe-0.6.1 portalocker-1.5.2 pylint-2.4.4 pyyaml-5.3 sacrebleu-1.4.3 subword-nmt-0.3.7 typed-ast-1.4.1\n"
          ],
          "name": "stdout"
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "0Y1lkrt2peIt",
        "colab_type": "code",
        "outputId": "6a42b2d6-cabb-4d85-c6d8-0b9cb712d1a0",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 106
        }
      },
      "source": [
        "! pip install fuzzywuzzy"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "Collecting fuzzywuzzy\n",
            "  Downloading https://files.pythonhosted.org/packages/d8/f1/5a267addb30ab7eaa1beab2b9323073815da4551076554ecc890a3595ec9/fuzzywuzzy-0.17.0-py2.py3-none-any.whl\n",
            "Installing collected packages: fuzzywuzzy\n",
            "Successfully installed fuzzywuzzy-0.17.0\n"
          ],
          "name": "stdout"
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "Zt4JPGlNpmmG",
        "colab_type": "code",
        "outputId": "1e32e545-3c68-4154-ef57-94380bf1144a",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 228
        }
      },
      "source": [
        "! pip install python-Levenshtein"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "Collecting python-Levenshtein\n",
            "\u001b[?25l  Downloading https://files.pythonhosted.org/packages/42/a9/d1785c85ebf9b7dfacd08938dd028209c34a0ea3b1bcdb895208bd40a67d/python-Levenshtein-0.12.0.tar.gz (48kB)\n",
            "\r\u001b[K     |██████▊                         | 10kB 31.1MB/s eta 0:00:01\r\u001b[K     |█████████████▌                  | 20kB 2.1MB/s eta 0:00:01\r\u001b[K     |████████████████████▏           | 30kB 2.7MB/s eta 0:00:01\r\u001b[K     |███████████████████████████     | 40kB 2.0MB/s eta 0:00:01\r\u001b[K     |████████████████████████████████| 51kB 2.1MB/s \n",
            "\u001b[?25hRequirement already satisfied: setuptools in /usr/local/lib/python3.6/dist-packages (from python-Levenshtein) (45.1.0)\n",
            "Building wheels for collected packages: python-Levenshtein\n",
            "  Building wheel for python-Levenshtein (setup.py) ... \u001b[?25l\u001b[?25hdone\n",
            "  Created wheel for python-Levenshtein: filename=python_Levenshtein-0.12.0-cp36-cp36m-linux_x86_64.whl size=144667 sha256=3c10c6cb5f031cdd3b567ad1ca99ded89d6e20b1928a0e7aca1ac93a8561b2af\n",
            "  Stored in directory: /root/.cache/pip/wheels/de/c2/93/660fd5f7559049268ad2dc6d81c4e39e9e36518766eaf7e342\n",
            "Successfully built python-Levenshtein\n",
            "Installing collected packages: python-Levenshtein\n",
            "Successfully installed python-Levenshtein-0.12.0\n"
          ],
          "name": "stdout"
        }
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "akiy3TCiQgkP",
        "colab_type": "text"
      },
      "source": [
        "## Imports"
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "Y4YkB1RkQiAv",
        "colab_type": "code",
        "colab": {}
      },
      "source": [
        "from os import path\n",
        "import os\n",
        "import time\n",
        "\n",
        "import pandas as pd\n",
        "import numpy as np\n",
        "from nltk.tokenize import TreebankWordTokenizer\n",
        "from fuzzywuzzy import process"
      ],
      "execution_count": 0,
      "outputs": []
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "61_N3-mYPRY9",
        "colab_type": "text"
      },
      "source": [
        "## Data Gathering"
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "EkW-mUdvQ1eY",
        "colab_type": "code",
        "colab": {}
      },
      "source": [
        "source_language = 'en'\n",
        "target_language = 'kmb'\n",
        "os.environ[\"data_path\"] = path.join(\"joeynmt\", \"data\", source_language + target_language) \n",
        "os.environ[\"src\"] = source_language \n",
        "os.environ[\"tgt\"] = target_language"
      ],
      "execution_count": 0,
      "outputs": []
    },
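    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "The next cell downloads the JW300 en–kmb bitext with `opus_read` and keeps only pairs where both sides are longer than two characters, since very short lines are usually extraction noise rather than real sentences."
      ]
    },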
    {
      "cell_type": "code",
      "metadata": {
        "id": "stPP3nXaQmK3",
        "colab_type": "code",
        "outputId": "3c6e86e4-faf3-4960-af3f-f7285d3983d7",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 355
        }
      },
      "source": [
        "# JW300 data\n",
        "! opus_read -d JW300 -s $tgt -t $src -wm moses -w jw300.$tgt jw300.$src -q\n",
        "\n",
        "source = []\n",
        "target = []\n",
        "with open('jw300.' + source_language) as f:\n",
        "  for _, line in enumerate(f):\n",
        "    source.append(line.strip())\n",
        "with open('jw300.' + target_language) as f:\n",
        "  for _, line in enumerate(f):\n",
        "    target.append(line.strip())\n",
        "\n",
        "jw300_raw = []\n",
        "for idx, line in enumerate(source):\n",
        "  if len(line) > 2:\n",
        "    if len(target[idx]) > 2:\n",
        "      jw300_raw.append([line, target[idx]])\n",
        "\n",
        "jw300 = pd.DataFrame(jw300_raw, columns=['source_sentence', 'target_sentence'])\n",
        "jw300.head(3)"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "\n",
            "Alignment file /proj/nlpl/data/OPUS/JW300/latest/xml/en-kmb.xml.gz not found. The following files are available for downloading:\n",
            "\n",
            " 920 KB https://object.pouta.csc.fi/OPUS-JW300/v1/xml/en-kmb.xml.gz\n",
            " 263 MB https://object.pouta.csc.fi/OPUS-JW300/v1/xml/en.zip\n",
            "  10 MB https://object.pouta.csc.fi/OPUS-JW300/v1/xml/kmb.zip\n",
            "\n",
            " 274 MB Total size\n",
            "./JW300_latest_xml_en-kmb.xml.gz ... 100% of 920 KB\n",
            "./JW300_latest_xml_en.zip ... 100% of 263 MB\n",
            "./JW300_latest_xml_kmb.zip ... 100% of 10 MB\n"
          ],
          "name": "stdout"
        },
        {
          "output_type": "execute_result",
          "data": {
            "text/html": [
              "<div>\n",
              "<style scoped>\n",
              "    .dataframe tbody tr th:only-of-type {\n",
              "        vertical-align: middle;\n",
              "    }\n",
              "\n",
              "    .dataframe tbody tr th {\n",
              "        vertical-align: top;\n",
              "    }\n",
              "\n",
              "    .dataframe thead th {\n",
              "        text-align: right;\n",
              "    }\n",
              "</style>\n",
              "<table border=\"1\" class=\"dataframe\">\n",
              "  <thead>\n",
              "    <tr style=\"text-align: right;\">\n",
              "      <th></th>\n",
              "      <th>source_sentence</th>\n",
              "      <th>target_sentence</th>\n",
              "    </tr>\n",
              "  </thead>\n",
              "  <tbody>\n",
              "    <tr>\n",
              "      <th>0</th>\n",
              "      <td>Table of Contents</td>\n",
              "      <td>Iala – mu</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>1</th>\n",
              "      <td>December 1 , 2010</td>\n",
              "      <td>1 Ua Katatu Ua 2011</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>2</th>\n",
              "      <td>Who Inhabit the Spirit Realm ?</td>\n",
              "      <td>O Kuiala ku Diulu Kuene Muene Athu mu Nzumbi</td>\n",
              "    </tr>\n",
              "  </tbody>\n",
              "</table>\n",
              "</div>"
            ],
            "text/plain": [
              "                  source_sentence                               target_sentence\n",
              "0               Table of Contents                                     Iala – mu\n",
              "1               December 1 , 2010                           1 Ua Katatu Ua 2011\n",
              "2  Who Inhabit the Spirit Realm ?  O Kuiala ku Diulu Kuene Muene Athu mu Nzumbi"
            ]
          },
          "metadata": {
            "tags": []
          },
          "execution_count": 10
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "IUJ6tYVWSXzd",
        "colab_type": "code",
        "outputId": "ca3bd0d1-0b2a-406f-ec92-ec802f8c888b",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 526
        }
      },
      "source": [
        "# Common test data\n",
        "source_test_file = 'test.en-' + target_language + '.en'\n",
        "target_test_file = 'test.en-' + target_language + '.' + target_language\n",
        "\n",
        "! wget https://raw.githubusercontent.com/jaderabbit/masakhane/master/jw300_utils/test/test.en-$tgt.en\n",
        "! wget https://raw.githubusercontent.com/jaderabbit/masakhane/master/jw300_utils/test/test.en-$tgt.$tgt\n",
        "\n",
        "source = []\n",
        "target = []\n",
        "with open(source_test_file) as f:\n",
        "  for _, line in enumerate(f):\n",
        "    source.append(line.strip())\n",
        "with open(target_test_file) as f:\n",
        "  for _, line in enumerate(f):\n",
        "    target.append(line.strip())\n",
        "\n",
        "! rm test.en-$tgt.en\n",
        "! rm test.en-$tgt.$tgt\n",
        "\n",
        "test_raw = []\n",
        "for idx, line in enumerate(source):\n",
        "  if len(line) > 2:\n",
        "    if len(target[idx]) > 2:\n",
        "      test_raw.append([line, target[idx]])\n",
        "\n",
        "df_test = pd.DataFrame(test_raw, columns=['source_sentence', 'target_sentence'])\n",
        "df_test.head(3)"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "--2020-02-04 19:04:17--  https://raw.githubusercontent.com/jaderabbit/masakhane/master/jw300_utils/test/test.en-kmb.en\n",
            "Resolving raw.githubusercontent.com (raw.githubusercontent.com)... 151.101.0.133, 151.101.64.133, 151.101.128.133, ...\n",
            "Connecting to raw.githubusercontent.com (raw.githubusercontent.com)|151.101.0.133|:443... connected.\n",
            "HTTP request sent, awaiting response... 200 OK\n",
            "Length: 204945 (200K) [text/plain]\n",
            "Saving to: ‘test.en-kmb.en’\n",
            "\n",
            "\rtest.en-kmb.en        0%[                    ]       0  --.-KB/s               \rtest.en-kmb.en      100%[===================>] 200.14K  --.-KB/s    in 0.03s   \n",
            "\n",
            "2020-02-04 19:04:17 (6.53 MB/s) - ‘test.en-kmb.en’ saved [204945/204945]\n",
            "\n",
            "--2020-02-04 19:04:18--  https://raw.githubusercontent.com/jaderabbit/masakhane/master/jw300_utils/test/test.en-kmb.kmb\n",
            "Resolving raw.githubusercontent.com (raw.githubusercontent.com)... 151.101.0.133, 151.101.64.133, 151.101.128.133, ...\n",
            "Connecting to raw.githubusercontent.com (raw.githubusercontent.com)|151.101.0.133|:443... connected.\n",
            "HTTP request sent, awaiting response... 200 OK\n",
            "Length: 230668 (225K) [text/plain]\n",
            "Saving to: ‘test.en-kmb.kmb’\n",
            "\n",
            "test.en-kmb.kmb     100%[===================>] 225.26K  --.-KB/s    in 0.04s   \n",
            "\n",
            "2020-02-04 19:04:18 (5.35 MB/s) - ‘test.en-kmb.kmb’ saved [230668/230668]\n",
            "\n"
          ],
          "name": "stdout"
        },
        {
          "output_type": "execute_result",
          "data": {
            "text/html": [
              "<div>\n",
              "<style scoped>\n",
              "    .dataframe tbody tr th:only-of-type {\n",
              "        vertical-align: middle;\n",
              "    }\n",
              "\n",
              "    .dataframe tbody tr th {\n",
              "        vertical-align: top;\n",
              "    }\n",
              "\n",
              "    .dataframe thead th {\n",
              "        text-align: right;\n",
              "    }\n",
              "</style>\n",
              "<table border=\"1\" class=\"dataframe\">\n",
              "  <thead>\n",
              "    <tr style=\"text-align: right;\">\n",
              "      <th></th>\n",
              "      <th>source_sentence</th>\n",
              "      <th>target_sentence</th>\n",
              "    </tr>\n",
              "  </thead>\n",
              "  <tbody>\n",
              "    <tr>\n",
              "      <th>0</th>\n",
              "      <td>Dorcas “ abounded in good deeds and gifts of m...</td>\n",
              "      <td>Dorka , “ [ uavudile ] jimbote ni jimola [ ja ...</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>1</th>\n",
              "      <td>What will be considered in this article , and ...</td>\n",
              "      <td>Ihi i tua - nda di longa ku mbandu íii , ni mu...</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>2</th>\n",
              "      <td>Some names in this article have been changed .</td>\n",
              "      <td>Saí majina a a lungulula .</td>\n",
              "    </tr>\n",
              "  </tbody>\n",
              "</table>\n",
              "</div>"
            ],
            "text/plain": [
              "                                     source_sentence                                    target_sentence\n",
              "0  Dorcas “ abounded in good deeds and gifts of m...  Dorka , “ [ uavudile ] jimbote ni jimola [ ja ...\n",
              "1  What will be considered in this article , and ...  Ihi i tua - nda di longa ku mbandu íii , ni mu...\n",
              "2     Some names in this article have been changed .                         Saí majina a a lungulula ."
            ]
          },
          "metadata": {
            "tags": []
          },
          "execution_count": 11
        }
      ]
    },
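    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "This shared Masakhane test set is held out so that results stay comparable across submissions, which means training pairs that also appear in it must not be used for training. The cell below is a minimal illustrative sketch of checking that overlap with exact source-side matches; it is an assumption about the cleanup step, not part of the original pipeline, and the `fuzzywuzzy`/`python-Levenshtein` packages installed above can additionally catch near-duplicates."
      ]
    },
    {
      "cell_type": "code",
      "metadata": {},
      "source": [
        "# Illustrative sketch (assumption): count training pairs whose English side\n",
        "# also occurs in the global test set, so they can be excluded from training.\n",
        "test_sources = set(df_test['source_sentence'])\n",
        "overlap_mask = jw300['source_sentence'].isin(test_sources)\n",
        "print('Training pairs overlapping the test set:', int(overlap_mask.sum()))"
      ],
      "execution_count": 0,
      "outputs": []
    },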
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "caJcH-I5PL19",
        "colab_type": "text"
      },
      "source": [
        "## Word Alignments for Corpus Filtering"
      ]
    },
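    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "`fast_align` expects one sentence pair per line in the form `source tokens ||| target tokens`. The next cells build that file from the JW300 pairs and run the aligner, keeping its per-sentence alignment scores so that poorly aligned (likely noisy) pairs can later be filtered out."
      ]
    },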
    {
      "cell_type": "code",
      "metadata": {
        "colab_type": "code",
        "outputId": "70061918-b183-417d-a1ca-4448e55277a0",
        "id": "ISn3UWvPPjkI",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 104
        }
      },
      "source": [
        "# Gather the parallel data (only JW300 here) for fast_align\n",
        "common = pd.concat([jw300])\n",
        "common['combined'] = common['source_sentence'] + ' ||| ' + common['target_sentence']\n",
        "common['combined'].values.tolist()[0:5]"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "execute_result",
          "data": {
            "text/plain": [
              "['Table of Contents ||| Iala – mu',\n",
              " 'December 1 , 2010 ||| 1 Ua Katatu Ua 2011',\n",
              " 'Who Inhabit the Spirit Realm ? ||| O Kuiala ku Diulu Kuene Muene Athu mu Nzumbi',\n",
              " 'FROM OUR COVER ||| TU SANGA - MU UÉ MILONGI ÍII',\n",
              " '3 Someone Is Out There \\u200b — But Who ? ||| 3 Kuene Athu mu Nzumbi \\u200b — a Nanhi ?']"
            ]
          },
          "metadata": {
            "tags": []
          },
          "execution_count": 12
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "gPkjmeHweCtd",
        "colab_type": "code",
        "outputId": "b4099aee-2db8-49bb-b763-b2ff8e23d839",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 211
        }
      },
      "source": [
        "# Output to a file\n",
        "with open(\"word_align_file.txt\", \"w\") as wa_file:\n",
        "  for sample in common['combined'].values.tolist():\n",
        "    wa_file.write(sample+\"\\n\")\n",
        "\n",
        "! head word_align_file.txt"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "Table of Contents ||| Iala – mu\n",
            "December 1 , 2010 ||| 1 Ua Katatu Ua 2011\n",
            "Who Inhabit the Spirit Realm ? ||| O Kuiala ku Diulu Kuene Muene Athu mu Nzumbi\n",
            "FROM OUR COVER ||| TU SANGA - MU UÉ MILONGI ÍII\n",
            "3 Someone Is Out There ​ — But Who ? ||| 3 Kuene Athu mu Nzumbi ​ — a Nanhi ?\n",
            "4 Visions of the Spirit Realm ||| 4 Isuma ia Athu a Tungu mu Nzumbi\n",
            "7 Contact With the Spirit Realm ||| 7 Tu Tena Kuzuela ni Athu a Tungu mu Nzumbi ?\n",
            "REGULAR FEATURES ||| TUA - NDA DI LONGA UÉ\n",
            "10 Did You Know ? ||| 10 Atangi a Madivulu Metu Ebhula . . .\n",
            "11 Draw Close to God ​ — He Knows “ the Heart of the Sons of Mankind ” ||| MILONGI PHALA KU DI LONGA 28 ia Kauana katé ku 6 ia Katanu MBANDU IA 11 - MIMBU : 49 , 74\n"
          ],
          "name": "stdout"
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "r7TtC6ojfogR",
        "colab_type": "code",
        "outputId": "cb1df94b-3d65-4a19-c3d8-b088a528b953",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 1000
        }
      },
      "source": [
        "# Generate word alignments\n",
        "! ./fast_align/build/fast_align -i word_align_file.txt -d -o -v -s > forward.align"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "ARG=i\n",
            "ARG=d\n",
            "ARG=o\n",
            "ARG=v\n",
            "ARG=s\n",
            "INITIAL PASS \n",
            ".................................................. [50000]\n",
            "...............................................\n",
            "expected target length = source length * 1.25735\n",
            "ITERATION 1\n",
            ".................................................. [50000]\n",
            "...............................................\n",
            "  log_e likelihood: -4.17171e+07\n",
            "  log_2 likelihood: -6.0185e+07\n",
            "     cross entropy: 29.8974\n",
            "        perplexity: 1e+09\n",
            "      posterior p0: 0.08\n",
            " posterior al-feat: -0.16963\n",
            "       size counts: 2676\n",
            "ITERATION 2\n",
            ".................................................. [50000]\n",
            "...............................................\n",
            "  log_e likelihood: -1.0431e+07\n",
            "  log_2 likelihood: -1.50487e+07\n",
            "     cross entropy: 7.47556\n",
            "        perplexity: 177.979\n",
            "      posterior p0: 0.0599725\n",
            " posterior al-feat: -0.138555\n",
            "       size counts: 2676\n",
            "  1  model al-feat: -0.139253 (tension=4)\n",
            "  2  model al-feat: -0.138985 (tension=4.01396)\n",
            "  3  model al-feat: -0.13882 (tension=4.02256)\n",
            "  4  model al-feat: -0.138719 (tension=4.02786)\n",
            "  5  model al-feat: -0.138656 (tension=4.03114)\n",
            "  6  model al-feat: -0.138618 (tension=4.03316)\n",
            "  7  model al-feat: -0.138594 (tension=4.03441)\n",
            "  8  model al-feat: -0.138579 (tension=4.03518)\n",
            "     final tension: 4.03566\n",
            "ITERATION 3\n",
            ".................................................. [50000]\n",
            "...............................................\n",
            "  log_e likelihood: -8.88493e+06\n",
            "  log_2 likelihood: -1.28182e+07\n",
            "     cross entropy: 6.36756\n",
            "        perplexity: 82.5707\n",
            "      posterior p0: 0.0555787\n",
            " posterior al-feat: -0.131422\n",
            "       size counts: 2676\n",
            "  1  model al-feat: -0.13857 (tension=4.03566)\n",
            "  2  model al-feat: -0.135876 (tension=4.17861)\n",
            "  3  model al-feat: -0.134234 (tension=4.26769)\n",
            "  4  model al-feat: -0.133212 (tension=4.32393)\n",
            "  5  model al-feat: -0.132567 (tension=4.35974)\n",
            "  6  model al-feat: -0.132157 (tension=4.38264)\n",
            "  7  model al-feat: -0.131895 (tension=4.39734)\n",
            "  8  model al-feat: -0.131727 (tension=4.4068)\n",
            "     final tension: 4.41288\n",
            "ITERATION 4\n",
            ".................................................. [50000]\n",
            "...............................................\n",
            "  log_e likelihood: -8.52964e+06\n",
            "  log_2 likelihood: -1.23057e+07\n",
            "     cross entropy: 6.11293\n",
            "        perplexity: 69.2111\n",
            "      posterior p0: 0.058605\n",
            " posterior al-feat: -0.12528\n",
            "       size counts: 2676\n",
            "  1  model al-feat: -0.131618 (tension=4.41288)\n",
            "  2  model al-feat: -0.129394 (tension=4.53964)\n",
            "  3  model al-feat: -0.12798 (tension=4.62192)\n",
            "  4  model al-feat: -0.127065 (tension=4.67592)\n",
            "  5  model al-feat: -0.126465 (tension=4.7116)\n",
            "  6  model al-feat: -0.126069 (tension=4.73529)\n",
            "  7  model al-feat: -0.125807 (tension=4.75106)\n",
            "  8  model al-feat: -0.125632 (tension=4.76158)\n",
            "     final tension: 4.76861\n",
            "ITERATION 5 (FINAL)\n",
            ".................................................. [50000]\n",
            "...............................................\n",
            "  log_e likelihood: -8.39616e+06\n",
            "  log_2 likelihood: -1.21131e+07\n",
            "     cross entropy: 6.01727\n",
            "        perplexity: 64.7709\n",
            "      posterior p0: 0\n",
            " posterior al-feat: 0\n",
            "       size counts: 2676\n"
          ],
          "name": "stdout"
        }
      ]
    },
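    {
      "cell_type": "code",
      "metadata": {},
      "source": [
        "# Optional sanity check: peek at the first line of forward.align. With the\n",
        "# flags used above, each line is assumed to end with \" ||| <score>\", the\n",
        "# sentence pair's alignment log-probability; the parsing in the next cell\n",
        "# relies on that format.\n",
        "with open('forward.align') as f:\n",
        "  print(f.readline().strip())"
      ],
      "execution_count": 0,
      "outputs": []
    },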
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "_YFtSFy2gHgn",
        "colab_type": "text"
      },
      "source": [
        "## Corpus Filtering"
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "4qshK1wnggKM",
        "colab_type": "code",
        "outputId": "e5279e1d-de8a-46e4-ee5e-c739258dbb25",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 258
        }
      },
      "source": [
        "# add word alignment scores into the common dataframe\n",
        "scores = []\n",
        "with open('forward.align') as f:\n",
        "  for _, line in enumerate(f):\n",
        "    scores.append(float(line.split(' ||| ')[-1]))\n",
        "\n",
        "common['scores'] = scores\n",
        "common.head()"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "execute_result",
          "data": {
            "text/html": [
              "<div>\n",
              "<style scoped>\n",
              "    .dataframe tbody tr th:only-of-type {\n",
              "        vertical-align: middle;\n",
              "    }\n",
              "\n",
              "    .dataframe tbody tr th {\n",
              "        vertical-align: top;\n",
              "    }\n",
              "\n",
              "    .dataframe thead th {\n",
              "        text-align: right;\n",
              "    }\n",
              "</style>\n",
              "<table border=\"1\" class=\"dataframe\">\n",
              "  <thead>\n",
              "    <tr style=\"text-align: right;\">\n",
              "      <th></th>\n",
              "      <th>source_sentence</th>\n",
              "      <th>target_sentence</th>\n",
              "      <th>combined</th>\n",
              "      <th>scores</th>\n",
              "    </tr>\n",
              "  </thead>\n",
              "  <tbody>\n",
              "    <tr>\n",
              "      <th>0</th>\n",
              "      <td>Table of Contents</td>\n",
              "      <td>Iala – mu</td>\n",
              "      <td>Table of Contents ||| Iala – mu</td>\n",
              "      <td>-6.44585</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>1</th>\n",
              "      <td>December 1 , 2010</td>\n",
              "      <td>1 Ua Katatu Ua 2011</td>\n",
              "      <td>December 1 , 2010 ||| 1 Ua Katatu Ua 2011</td>\n",
              "      <td>-26.63540</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>2</th>\n",
              "      <td>Who Inhabit the Spirit Realm ?</td>\n",
              "      <td>O Kuiala ku Diulu Kuene Muene Athu mu Nzumbi</td>\n",
              "      <td>Who Inhabit the Spirit Realm ? ||| O Kuiala ku...</td>\n",
              "      <td>-32.64520</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>3</th>\n",
              "      <td>FROM OUR COVER</td>\n",
              "      <td>TU SANGA - MU UÉ MILONGI ÍII</td>\n",
              "      <td>FROM OUR COVER ||| TU SANGA - MU UÉ MILONGI ÍII</td>\n",
              "      <td>-17.60530</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>4</th>\n",
              "      <td>3 Someone Is Out There ​ — But Who ?</td>\n",
              "      <td>3 Kuene Athu mu Nzumbi ​ — a Nanhi ?</td>\n",
              "      <td>3 Someone Is Out There ​ — But Who ? ||| 3 Kue...</td>\n",
              "      <td>-34.61690</td>\n",
              "    </tr>\n",
              "  </tbody>\n",
              "</table>\n",
              "</div>"
            ],
            "text/plain": [
              "                        source_sentence  ...    scores\n",
              "0                     Table of Contents  ...  -6.44585\n",
              "1                     December 1 , 2010  ... -26.63540\n",
              "2        Who Inhabit the Spirit Realm ?  ... -32.64520\n",
              "3                        FROM OUR COVER  ... -17.60530\n",
              "4  3 Someone Is Out There ​ — But Who ?  ... -34.61690\n",
              "\n",
              "[5 rows x 4 columns]"
            ]
          },
          "metadata": {
            "tags": []
          },
          "execution_count": 15
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "jhWPxbK0kyBP",
        "colab_type": "code",
        "outputId": "cfb2c28e-57b2-489c-ec1e-545f3ffe3edd",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 300
        }
      },
      "source": [
        "common.describe()"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "execute_result",
          "data": {
            "text/html": [
              "<div>\n",
              "<style scoped>\n",
              "    .dataframe tbody tr th:only-of-type {\n",
              "        vertical-align: middle;\n",
              "    }\n",
              "\n",
              "    .dataframe tbody tr th {\n",
              "        vertical-align: top;\n",
              "    }\n",
              "\n",
              "    .dataframe thead th {\n",
              "        text-align: right;\n",
              "    }\n",
              "</style>\n",
              "<table border=\"1\" class=\"dataframe\">\n",
              "  <thead>\n",
              "    <tr style=\"text-align: right;\">\n",
              "      <th></th>\n",
              "      <th>scores</th>\n",
              "    </tr>\n",
              "  </thead>\n",
              "  <tbody>\n",
              "    <tr>\n",
              "      <th>count</th>\n",
              "      <td>97218.000000</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>mean</th>\n",
              "      <td>-89.522981</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>std</th>\n",
              "      <td>59.855089</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>min</th>\n",
              "      <td>-1106.860000</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>25%</th>\n",
              "      <td>-119.555000</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>50%</th>\n",
              "      <td>-79.433800</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>75%</th>\n",
              "      <td>-47.867900</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>max</th>\n",
              "      <td>-1.122730</td>\n",
              "    </tr>\n",
              "  </tbody>\n",
              "</table>\n",
              "</div>"
            ],
            "text/plain": [
              "             scores\n",
              "count  97218.000000\n",
              "mean     -89.522981\n",
              "std       59.855089\n",
              "min    -1106.860000\n",
              "25%     -119.555000\n",
              "50%      -79.433800\n",
              "75%      -47.867900\n",
              "max       -1.122730"
            ]
          },
          "metadata": {
            "tags": []
          },
          "execution_count": 16
        }
      ]
    },
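    {
      "cell_type": "code",
      "metadata": {},
      "source": [
        "# A quick look at the score distribution before picking a cutoff; a minimal\n",
        "# sketch assuming matplotlib is available in the Colab runtime.\n",
        "import matplotlib.pyplot as plt\n",
        "\n",
        "common['scores'].hist(bins=100)\n",
        "plt.xlabel('alignment log-probability')\n",
        "plt.ylabel('number of sentence pairs')\n",
        "plt.show()"
      ],
      "execution_count": 0,
      "outputs": []
    },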
    {
      "cell_type": "code",
      "metadata": {
        "id": "PB9t9Zt7k9Ps",
        "colab_type": "code",
        "colab": {}
      },
      "source": [
        "# cut out anything below the 0.1 quantile (really bad)\n",
        "threshold = common.quantile(0.1, axis=0)['scores']\n",
        "common_clean = common[common['scores'] > threshold]"
      ],
      "execution_count": 0,
      "outputs": []
    },
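    {
      "cell_type": "code",
      "metadata": {},
      "source": [
        "# Aside: raw fast_align log-probabilities grow more negative with sentence\n",
        "# length, so a fixed quantile tends to drop long pairs first. One possible\n",
        "# refinement (sketched here, not used below) is to normalize by target length\n",
        "# before thresholding; 'norm_scores' is a hypothetical column name introduced\n",
        "# only for this illustration.\n",
        "common['norm_scores'] = common['scores'] / common['target_sentence'].str.split().str.len()\n",
        "threshold_norm = common['norm_scores'].quantile(0.1)\n",
        "print(len(common[common['norm_scores'] > threshold_norm]) / len(common))"
      ],
      "execution_count": 0,
      "outputs": []
    },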
    {
      "cell_type": "code",
      "metadata": {
        "id": "_PF8Z7UylbsW",
        "colab_type": "code",
        "outputId": "64f7e6cf-74f5-4782-af2a-358981d9e24b",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 34
        }
      },
      "source": [
        "# how many did we lose?\n",
        "len(common_clean)/len(common)"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "execute_result",
          "data": {
            "text/plain": [
              "0.8999979427678002"
            ]
          },
          "metadata": {
            "tags": []
          },
          "execution_count": 18
        }
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "WvH3ugvylv7v",
        "colab_type": "text"
      },
      "source": [
        "## Other pre-processing"
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "LK1AiflclpdM",
        "colab_type": "code",
        "outputId": "d54bf3bb-3f7c-45d4-f2ef-947aae6d3188",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 34
        }
      },
      "source": [
        "# drop test data from common\n",
        "df_pp = common_clean[~common_clean['source_sentence'].isin(df_test['source_sentence'].values)]\n",
        "df_pp = df_pp[~df_pp['target_sentence'].isin(df_test['target_sentence'].values)]\n",
        "\n",
        "# remove duplicates\n",
        "df_pp.drop_duplicates(inplace=True)\n",
        "\n",
        "# remove conflicting translations\n",
        "df_pp.drop_duplicates(subset='source_sentence', inplace=True)\n",
        "df_pp.drop_duplicates(subset='target_sentence', inplace=True)\n",
        "\n",
        "# what's left in terms of number of samples?\n",
        "len(df_pp)/len(common)"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "execute_result",
          "data": {
            "text/plain": [
              "0.8135530457322718"
            ]
          },
          "metadata": {
            "tags": []
          },
          "execution_count": 19
        }
      ]
    },
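    {
      "cell_type": "code",
      "metadata": {},
      "source": [
        "# Sanity check: after deduplication, every source and every target sentence\n",
        "# should appear exactly once.\n",
        "print(df_pp['source_sentence'].is_unique, df_pp['target_sentence'].is_unique)"
      ],
      "execution_count": 0,
      "outputs": []
    },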
    {
      "cell_type": "code",
      "metadata": {
        "id": "iGYR610PxdGK",
        "colab_type": "code",
        "colab": {}
      },
      "source": [
        "# reset the index of the training set after filtering\n",
        "df_pp.reset_index(drop=False, inplace=True)"
      ],
      "execution_count": 0,
      "outputs": []
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "_IuSBECCoDsW",
        "colab_type": "code",
        "outputId": "beca0972-aeb1-484d-f5b7-dd4218fc7be2",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 1000
        }
      },
      "source": [
        "# Remove samples from the training data set if they \"almost overlap\" with the\n",
        "# samples in the test set.\n",
        "\n",
        "# Filtering function. Adjust pad to narrow down the candidate matches to\n",
        "# within a certain length of characters of the given sample.\n",
        "def fuzzfilter(sample, candidates, pad):\n",
        "  candidates = [x for x in candidates if len(x) <= len(sample)+pad and len(x) >= len(sample)-pad] \n",
        "  #candidates = [x for x in candidates if len(x) >= len(sample)-pad]\n",
        "  if len(candidates) > 0:\n",
        "    return process.extractOne(sample, candidates)[1]\n",
        "  else:\n",
        "    return np.nan\n",
        "\n",
        "# NOTE - This might run slow depending on the size of your training set. We are\n",
        "# printing some information to help you track how long it would take. \n",
        "eng_test = df_test['source_sentence'].values.tolist()\n",
        "scores = []\n",
        "start_time = time.time()\n",
        "for idx, row in df_pp.iterrows():\n",
        "  scores.append(fuzzfilter(row['source_sentence'], eng_test, 5))\n",
        "  if idx % 1000 == 0:\n",
        "    hours, rem = divmod(time.time() - start_time, 3600)\n",
        "    minutes, seconds = divmod(rem, 60)\n",
        "    print(\"{:0>2}:{:0>2}:{:05.2f}\".format(int(hours),int(minutes),seconds), \"%0.2f percent complete\" % (100.0*float(idx)/float(len(df_pp))))"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "00:00:00.02 0.00 percent complete\n",
            "00:00:15.99 1.26 percent complete\n",
            "00:00:31.60 2.53 percent complete\n",
            "00:00:46.84 3.79 percent complete\n",
            "00:01:02.81 5.06 percent complete\n",
            "00:01:18.16 6.32 percent complete\n",
            "00:01:33.57 7.59 percent complete\n",
            "00:01:48.58 8.85 percent complete\n",
            "00:02:04.89 10.11 percent complete\n",
            "00:02:21.35 11.38 percent complete\n",
            "00:02:36.99 12.64 percent complete\n",
            "00:02:52.40 13.91 percent complete\n",
            "00:03:07.66 15.17 percent complete\n",
            "00:03:23.21 16.44 percent complete\n",
            "00:03:38.89 17.70 percent complete\n",
            "00:03:54.36 18.97 percent complete\n",
            "00:04:10.44 20.23 percent complete\n",
            "00:04:26.39 21.49 percent complete\n",
            "00:04:43.23 22.76 percent complete\n",
            "00:04:59.46 24.02 percent complete\n",
            "00:05:16.05 25.29 percent complete\n",
            "00:05:31.91 26.55 percent complete\n",
            "00:05:48.36 27.82 percent complete\n",
            "00:06:04.70 29.08 percent complete\n",
            "00:06:21.21 30.34 percent complete\n",
            "00:06:37.73 31.61 percent complete\n",
            "00:06:53.93 32.87 percent complete\n",
            "00:07:10.20 34.14 percent complete\n",
            "00:07:26.56 35.40 percent complete\n",
            "00:07:42.11 36.67 percent complete\n",
            "00:07:58.07 37.93 percent complete\n",
            "00:08:14.06 39.19 percent complete\n",
            "00:08:29.90 40.46 percent complete\n",
            "00:08:46.04 41.72 percent complete\n",
            "00:09:02.80 42.99 percent complete\n",
            "00:09:19.59 44.25 percent complete\n",
            "00:09:35.92 45.52 percent complete\n",
            "00:09:51.77 46.78 percent complete\n",
            "00:10:08.14 48.05 percent complete\n",
            "00:10:24.54 49.31 percent complete\n",
            "00:10:40.48 50.57 percent complete\n",
            "00:10:56.29 51.84 percent complete\n",
            "00:11:12.00 53.10 percent complete\n",
            "00:11:29.30 54.37 percent complete\n",
            "00:11:46.04 55.63 percent complete\n",
            "00:12:02.19 56.90 percent complete\n",
            "00:12:17.97 58.16 percent complete\n",
            "00:12:35.13 59.42 percent complete\n",
            "00:12:50.94 60.69 percent complete\n",
            "00:13:06.72 61.95 percent complete\n",
            "00:13:23.85 63.22 percent complete\n",
            "00:13:42.28 64.48 percent complete\n",
            "00:13:59.85 65.75 percent complete\n",
            "00:14:17.12 67.01 percent complete\n",
            "00:14:34.22 68.27 percent complete\n",
            "00:14:51.67 69.54 percent complete\n",
            "00:15:08.71 70.80 percent complete\n",
            "00:15:25.66 72.07 percent complete\n",
            "00:15:43.60 73.33 percent complete\n",
            "00:16:00.66 74.60 percent complete\n",
            "00:16:17.58 75.86 percent complete\n",
            "00:16:34.54 77.13 percent complete\n",
            "00:16:51.25 78.39 percent complete\n",
            "00:17:08.11 79.65 percent complete\n",
            "00:17:25.00 80.92 percent complete\n",
            "00:17:41.94 82.18 percent complete\n",
            "00:17:58.92 83.45 percent complete\n",
            "00:18:17.12 84.71 percent complete\n",
            "00:18:34.94 85.98 percent complete\n",
            "00:18:52.80 87.24 percent complete\n",
            "00:19:09.40 88.50 percent complete\n",
            "00:19:26.94 89.77 percent complete\n",
            "00:19:43.81 91.03 percent complete\n",
            "00:20:01.44 92.30 percent complete\n",
            "00:20:18.72 93.56 percent complete\n",
            "00:20:35.76 94.83 percent complete\n",
            "00:20:53.76 96.09 percent complete\n",
            "00:21:11.23 97.35 percent complete\n",
            "00:21:28.30 98.62 percent complete\n",
            "00:21:45.93 99.88 percent complete\n"
          ],
          "name": "stdout"
        }
      ]
    },
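    {
      "cell_type": "code",
      "metadata": {},
      "source": [
        "# Toy illustration of the score computed above: process.extractOne returns\n",
        "# the closest candidate together with a 0-100 similarity score, where values\n",
        "# near 100 indicate near-duplicates. The candidate list here is made up.\n",
        "from fuzzywuzzy import process\n",
        "\n",
        "print(process.extractOne('jehovah created all things',\n",
        "                         ['jehovah created everything', 'table of contents']))"
      ],
      "execution_count": 0,
      "outputs": []
    },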
    {
      "cell_type": "code",
      "metadata": {
        "id": "uBgUuWVX8V13",
        "colab_type": "code",
        "outputId": "c65da7a6-f253-4389-b3fb-9c9bcae047c7",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 34
        }
      },
      "source": [
        "# Filter out \"almost overlapping samples\"\n",
        "df_pp['scores'] = scores\n",
        "df_pp = df_pp[df_pp['scores'] < 95]\n",
        "\n",
        "# what's left in terms of number of samples?\n",
        "len(df_pp)/len(common)"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "execute_result",
          "data": {
            "text/plain": [
              "0.7892982780966488"
            ]
          },
          "metadata": {
            "tags": []
          },
          "execution_count": 22
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "RdHgU5xrmOKz",
        "colab_type": "code",
        "colab": {}
      },
      "source": [
        "## Lower case the corpus\n",
        "df_pp[\"source_sentence\"] = df_pp[\"source_sentence\"].str.lower()\n",
        "df_pp[\"target_sentence\"] = df_pp[\"target_sentence\"].str.lower()\n",
        "df_test[\"source_sentence\"] = df_test[\"source_sentence\"].str.lower()\n",
        "df_test[\"target_sentence\"] = df_test[\"target_sentence\"].str.lower()\n",
        "\n",
        "# shuffle the training/dev data\n",
        "df_pp = df_pp.sample(frac=1).reset_index(drop=True)\n",
        "\n",
        "# Do the split between dev/train\n",
        "num_dev_patterns = 1000\n",
        "dev = df_pp.tail(num_dev_patterns)\n",
        "stripped = df_pp.drop(df_pp.tail(num_dev_patterns).index)\n",
        "\n",
        "# output the final parallel corpus files\n",
        "with open(\"train.\"+source_language, \"w\") as src_file, open(\"train.\"+target_language, \"w\") as trg_file:\n",
        "  for index, row in stripped.iterrows():\n",
        "    src_file.write(row[\"source_sentence\"]+\"\\n\")\n",
        "    trg_file.write(row[\"target_sentence\"]+\"\\n\")\n",
        "    \n",
        "with open(\"dev.\"+source_language, \"w\") as src_file, open(\"dev.\"+target_language, \"w\") as trg_file:\n",
        "  for index, row in dev.iterrows():\n",
        "    src_file.write(row[\"source_sentence\"]+\"\\n\")\n",
        "    trg_file.write(row[\"target_sentence\"]+\"\\n\")\n",
        "\n",
        "with open(\"test.\"+source_language, \"w\") as src_file, open(\"test.\"+target_language, \"w\") as trg_file:\n",
        "  for index, row in df_test.iterrows():\n",
        "    src_file.write(row[\"source_sentence\"]+\"\\n\")\n",
        "    trg_file.write(row[\"target_sentence\"]+\"\\n\")"
      ],
      "execution_count": 0,
      "outputs": []
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "WugPZnZ_mv5H",
        "colab_type": "code",
        "outputId": "d126a0ce-ec38-4751-c459-c2569fb052e0",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 211
        }
      },
      "source": [
        "! head train.en"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "some may feel that they do not need anyone to explain the bible to them .\n",
            "most of us have very busy lives , but we should not let anything , including our responsibilities , stop us from reading the bible .\n",
            "giving may also lower stress and blood pressure .\n",
            "jehovah created all things\n",
            "but what can be done if the marital bond is strained ?\n",
            "( b ) explain how it became clear that the governing body was different from the watch tower society .\n",
            "this issue of the watchtower examines the bible’s claim that it can guide us in every aspect of life .\n",
            "then you throw them out onto the ground all at once .\n",
            "having no arms , i can fully empathize with those who have limitations .\n",
            "but even in such a situation , a wife will do what she can to teach her children the truth .\n"
          ],
          "name": "stdout"
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "N9lWcwN2m0O2",
        "colab_type": "code",
        "outputId": "1e9a1f52-de8f-4ce1-e00e-f731d1c1c121",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 211
        }
      },
      "source": [
        "! head train.kmb"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "nange sai athu a banza kuila , ka bhingi muthu phala ku a jimbuluila o milongi ia bibidia .\n",
            "( josuué 1 : 8 ) tuala ni ikalakalu iavulu , kidi muene , maji ki tua tokala kuehela kima ku tu fidisa kutanga o bibidia , né muene o salu ietu .\n",
            "o kubhana ku tu kuatekesa ue kusosolola o kuthandanganha , ni kulenga kua manhinga mu mixibha ietu .\n",
            "mukonda jihova , muéne ua bhange o ima ioso\n",
            "maji ihi ia tokala o ku bhanga se mu ukaza mu moneka maka ?\n",
            "( b ) jimbulula kiebhi o jiphange kiéza mu kuijiia kuila , o kibuka kia utuminu ki ki lungile ni sociedade torre de vigia .\n",
            "o kadivulu kaka o mulangidi , ka - nda zuela se kiebhi o milongi ia bibidia i tena ku tu kuatekesa ku muenhu uetu lelu .\n",
            "eie u ji lundulula joso bhoxi .\n",
            "mukonda dia kukamba o maku , ngi tena kuivua o ndolo ivua ió ala uá ni unema .\n",
            "né muene mu ithangana kála iii , o muhatu ua bhingi kubhanga ioso , phala kulonga o itumu ia jihova ku tuana tuê .\n"
          ],
          "name": "stdout"
        }
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "eNbaZnhznC8-",
        "colab_type": "text"
      },
      "source": [
        "## Subword BPE Tokens"
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "PlSQH_bQm82c",
        "colab_type": "code",
        "outputId": "4dfde776-d0e2-4208-827b-75faac7f5740",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 384
        }
      },
      "source": [
        "# Do BPE\n",
        "! subword-nmt learn-joint-bpe-and-vocab --input train.$src train.$tgt -s 4000 -o bpe.codes.4000 --write-vocabulary vocab.$src vocab.$tgt\n",
        "\n",
        "! subword-nmt apply-bpe -c bpe.codes.4000 --vocabulary vocab.$src < train.$src > train.bpe.$src\n",
        "! subword-nmt apply-bpe -c bpe.codes.4000 --vocabulary vocab.$tgt < train.$tgt > train.bpe.$tgt\n",
        "! subword-nmt apply-bpe -c bpe.codes.4000 --vocabulary vocab.$src < dev.$src > dev.bpe.$src\n",
        "! subword-nmt apply-bpe -c bpe.codes.4000 --vocabulary vocab.$tgt < dev.$tgt > dev.bpe.$tgt\n",
        "! subword-nmt apply-bpe -c bpe.codes.4000 --vocabulary vocab.$src < test.$src > test.bpe.$src\n",
        "! subword-nmt apply-bpe -c bpe.codes.4000 --vocabulary vocab.$tgt < test.$tgt > test.bpe.$tgt\n",
        "\n",
        "# Create directory, move everyone we care about to the correct location\n",
        "! mkdir -p $data_path\n",
        "! cp train.* $data_path\n",
        "! cp test.* $data_path\n",
        "! cp dev.* $data_path\n",
        "! cp bpe.codes.4000 $data_path\n",
        "! ls $data_path\n",
        "\n",
        "# Create that vocab using build_vocab\n",
        "! sudo chmod 777 joeynmt/scripts/build_vocab.py\n",
        "! joeynmt/scripts/build_vocab.py joeynmt/data/$src$tgt/train.bpe.$src joeynmt/data/$src$tgt/train.bpe.$tgt --output_path joeynmt/data/$src$tgt/vocab.txt\n",
        "\n",
        "# Some output\n",
        "! echo \"BPE Target Sentences\"\n",
        "! tail -n 5 test.bpe.$tgt\n",
        "! echo \"Combined BPE Vocab\"\n",
        "! tail -n 10 joeynmt/data/en$tgt/vocab.txt"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "bpe.codes.4000\tdev.en\t     test.bpe.kmb  train.bpe.en   train.kmb\n",
            "dev.bpe.en\tdev.kmb      test.en\t   train.bpe.kmb\n",
            "dev.bpe.kmb\ttest.bpe.en  test.kmb\t   train.en\n",
            "BPE Target Sentences\n",
            "o ngu@@ b@@ u ya ku@@ xikana ( tala o kaxi 12 - 14 )\n",
            "o ka@@ pas@@ ete ka kubh@@ uluka ( tala o kaxi 15 - 18 )\n",
            "nga mono kwila o athu a xikina dingi se a mona kwila ey@@ e wa zolo mwene o bibidya , wa mu bhanga yoso i u tena phala ku a kwatekesa . ”\n",
            "o xi@@ bhata ya nzumbi ikôla ( tala o kaxi 19 - 20 )\n",
            "ni ki@@ kwat@@ ek@@ esu kya jihova tu tena kubh@@ ânga nê !\n",
            "Combined BPE Vocab\n",
            "urrec@@\n",
            "pher@@\n",
            "danii@@\n",
            "espe@@\n",
            "enhu\n",
            "ould\n",
            "beh@@\n",
            "paradi@@\n",
            "effor@@\n",
            "kobo\n"
          ],
          "name": "stdout"
        }
      ]
    },
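    {
      "cell_type": "code",
      "metadata": {},
      "source": [
        "# The \"@@ \" markers flag subwords that continue into the next token, so BPE\n",
        "# can be undone by simply deleting them; handy when reading model output.\n",
        "# The example line is taken from the BPE output above.\n",
        "line = 'o ngu@@ b@@ u ya ku@@ xikana ( tala o kaxi 12 - 14 )'\n",
        "print(line.replace('@@ ', ''))"
      ],
      "execution_count": 0,
      "outputs": []
    },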
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "mox2nEPXnnOe",
        "colab_type": "text"
      },
      "source": [
        "## JoeyNMT Config"
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "ugHbYjQPnNo4",
        "colab_type": "code",
        "colab": {}
      },
      "source": [
        "# This creates the config file for our JoeyNMT system. \n",
        "name = '%s%s' % (source_language, target_language)\n",
        "\n",
        "config = \"\"\"\n",
        "name: \"{name}_transformer\"\n",
        "\n",
        "data:\n",
        "    src: \"{source_language}\"\n",
        "    trg: \"{target_language}\"\n",
        "    train: \"data/{name}/train.bpe\"\n",
        "    dev:   \"data/{name}/dev.bpe\"\n",
        "    test:  \"data/{name}/test.bpe\"\n",
        "    level: \"bpe\"\n",
        "    lowercase: False\n",
        "    max_sent_length: 100\n",
        "    src_vocab: \"data/{name}/vocab.txt\"\n",
        "    trg_vocab: \"data/{name}/vocab.txt\"\n",
        "\n",
        "testing:\n",
        "    beam_size: 5\n",
        "    alpha: 1.0\n",
        "\n",
        "training:\n",
        "    #load_model: \"models/{name}_transformer/12000.ckpt\" # if given, load a pre-trained model from this checkpoint\n",
        "    random_seed: 42\n",
        "    optimizer: \"adam\"\n",
        "    normalization: \"tokens\"\n",
        "    adam_betas: [0.9, 0.999] \n",
        "    scheduling: \"noam\"            # Try switching from plateau to Noam scheduling\n",
        "    learning_rate_factor: 0.5       # factor for Noam scheduler (used with Transformer)\n",
        "    learning_rate_warmup: 1000      # warmup steps for Noam scheduler (used with Transformer)\n",
        "    patience: 8\n",
        "    decrease_factor: 0.7\n",
        "    loss: \"crossentropy\"\n",
        "    learning_rate: 0.0002\n",
        "    learning_rate_min: 0.00000001\n",
        "    weight_decay: 0.0\n",
        "    label_smoothing: 0.1\n",
        "    batch_size: 4096\n",
        "    batch_type: \"token\"\n",
        "    eval_batch_size: 3600\n",
        "    eval_batch_type: \"token\"\n",
        "    batch_multiplier: 1\n",
        "    early_stopping_metric: \"eval_metric\" # \"ppl\"\n",
        "    epochs: 40\n",
        "    validation_freq: 2000\n",
        "    logging_freq: 200\n",
        "    eval_metric: \"bleu\"\n",
        "    model_dir: \"models/{name}_transformer\"\n",
        "    overwrite: True\n",
        "    shuffle: True\n",
        "    use_cuda: True\n",
        "    max_output_length: 100\n",
        "    print_valid_sents: [0, 1, 2, 3]\n",
        "    keep_last_ckpts: 3\n",
        "\n",
        "model:\n",
        "    initializer: \"xavier\"\n",
        "    bias_initializer: \"zeros\"\n",
        "    init_gain: 1.0\n",
        "    embed_initializer: \"xavier\"\n",
        "    embed_init_gain: 1.0\n",
        "    tied_embeddings: True\n",
        "    tied_softmax: True\n",
        "    encoder:\n",
        "        type: \"transformer\"\n",
        "        num_layers: 6\n",
        "        num_heads: 8\n",
        "        embeddings:\n",
        "            embedding_dim: 512\n",
        "            scale: True\n",
        "            dropout: 0.\n",
        "        # typically ff_size = 4 x hidden_size\n",
        "        hidden_size: 512\n",
        "        ff_size: 2048\n",
        "        dropout: 0.3\n",
        "    decoder:\n",
        "        type: \"transformer\"\n",
        "        num_layers: 6\n",
        "        num_heads: 8\n",
        "        embeddings:\n",
        "            embedding_dim: 512\n",
        "            scale: True\n",
        "            dropout: 0.\n",
        "        # typically ff_size = 4 x hidden_size\n",
        "        hidden_size: 512\n",
        "        ff_size: 2048\n",
        "        dropout: 0.3\n",
        "\"\"\".format(name=name, source_language=source_language, target_language=target_language)\n",
        "with open(\"joeynmt/configs/transformer_{name}.yaml\".format(name=name),'w') as f:\n",
        "    f.write(config)"
      ],
      "execution_count": 0,
      "outputs": []
    },
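    {
      "cell_type": "code",
      "metadata": {},
      "source": [
        "# Quick check that the generated config parses as valid YAML before handing\n",
        "# it to JoeyNMT (indentation mistakes in the template would surface here).\n",
        "import yaml\n",
        "\n",
        "with open('joeynmt/configs/transformer_{name}.yaml'.format(name=name)) as f:\n",
        "    cfg = yaml.safe_load(f)\n",
        "print(cfg['name'], '| epochs:', cfg['training']['epochs'])"
      ],
      "execution_count": 0,
      "outputs": []
    },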
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "eNQ_9LO4n04W",
        "colab_type": "text"
      },
      "source": [
        "## Train the model"
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "hWwNavEHnxs8",
        "colab_type": "code",
        "outputId": "8e3ac93f-5914-49f1-afd9-5b452e19b4e8",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 1000
        }
      },
      "source": [
        "!cd joeynmt; python3 -m joeynmt train configs/transformer_$src$tgt.yaml"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "2020-02-04 19:34:52,715 Hello! This is Joey-NMT.\n",
            "2020-02-04 19:34:54,040 Total params: 46249984\n",
            "2020-02-04 19:34:54,042 Trainable parameters: ['decoder.layer_norm.bias', 'decoder.layer_norm.weight', 'decoder.layers.0.dec_layer_norm.bias', 'decoder.layers.0.dec_layer_norm.weight', 'decoder.layers.0.feed_forward.layer_norm.bias', 'decoder.layers.0.feed_forward.layer_norm.weight', 'decoder.layers.0.feed_forward.pwff_layer.0.bias', 'decoder.layers.0.feed_forward.pwff_layer.0.weight', 'decoder.layers.0.feed_forward.pwff_layer.3.bias', 'decoder.layers.0.feed_forward.pwff_layer.3.weight', 'decoder.layers.0.src_trg_att.k_layer.bias', 'decoder.layers.0.src_trg_att.k_layer.weight', 'decoder.layers.0.src_trg_att.output_layer.bias', 'decoder.layers.0.src_trg_att.output_layer.weight', 'decoder.layers.0.src_trg_att.q_layer.bias', 'decoder.layers.0.src_trg_att.q_layer.weight', 'decoder.layers.0.src_trg_att.v_layer.bias', 'decoder.layers.0.src_trg_att.v_layer.weight', 'decoder.layers.0.trg_trg_att.k_layer.bias', 'decoder.layers.0.trg_trg_att.k_layer.weight', 'decoder.layers.0.trg_trg_att.output_layer.bias', 'decoder.layers.0.trg_trg_att.output_layer.weight', 'decoder.layers.0.trg_trg_att.q_layer.bias', 'decoder.layers.0.trg_trg_att.q_layer.weight', 'decoder.layers.0.trg_trg_att.v_layer.bias', 'decoder.layers.0.trg_trg_att.v_layer.weight', 'decoder.layers.0.x_layer_norm.bias', 'decoder.layers.0.x_layer_norm.weight', 'decoder.layers.1.dec_layer_norm.bias', 'decoder.layers.1.dec_layer_norm.weight', 'decoder.layers.1.feed_forward.layer_norm.bias', 'decoder.layers.1.feed_forward.layer_norm.weight', 'decoder.layers.1.feed_forward.pwff_layer.0.bias', 'decoder.layers.1.feed_forward.pwff_layer.0.weight', 'decoder.layers.1.feed_forward.pwff_layer.3.bias', 'decoder.layers.1.feed_forward.pwff_layer.3.weight', 'decoder.layers.1.src_trg_att.k_layer.bias', 'decoder.layers.1.src_trg_att.k_layer.weight', 'decoder.layers.1.src_trg_att.output_layer.bias', 'decoder.layers.1.src_trg_att.output_layer.weight', 'decoder.layers.1.src_trg_att.q_layer.bias', 'decoder.layers.1.src_trg_att.q_layer.weight', 'decoder.layers.1.src_trg_att.v_layer.bias', 'decoder.layers.1.src_trg_att.v_layer.weight', 'decoder.layers.1.trg_trg_att.k_layer.bias', 'decoder.layers.1.trg_trg_att.k_layer.weight', 'decoder.layers.1.trg_trg_att.output_layer.bias', 'decoder.layers.1.trg_trg_att.output_layer.weight', 'decoder.layers.1.trg_trg_att.q_layer.bias', 'decoder.layers.1.trg_trg_att.q_layer.weight', 'decoder.layers.1.trg_trg_att.v_layer.bias', 'decoder.layers.1.trg_trg_att.v_layer.weight', 'decoder.layers.1.x_layer_norm.bias', 'decoder.layers.1.x_layer_norm.weight', 'decoder.layers.2.dec_layer_norm.bias', 'decoder.layers.2.dec_layer_norm.weight', 'decoder.layers.2.feed_forward.layer_norm.bias', 'decoder.layers.2.feed_forward.layer_norm.weight', 'decoder.layers.2.feed_forward.pwff_layer.0.bias', 'decoder.layers.2.feed_forward.pwff_layer.0.weight', 'decoder.layers.2.feed_forward.pwff_layer.3.bias', 'decoder.layers.2.feed_forward.pwff_layer.3.weight', 'decoder.layers.2.src_trg_att.k_layer.bias', 'decoder.layers.2.src_trg_att.k_layer.weight', 'decoder.layers.2.src_trg_att.output_layer.bias', 'decoder.layers.2.src_trg_att.output_layer.weight', 'decoder.layers.2.src_trg_att.q_layer.bias', 'decoder.layers.2.src_trg_att.q_layer.weight', 'decoder.layers.2.src_trg_att.v_layer.bias', 'decoder.layers.2.src_trg_att.v_layer.weight', 'decoder.layers.2.trg_trg_att.k_layer.bias', 'decoder.layers.2.trg_trg_att.k_layer.weight', 'decoder.layers.2.trg_trg_att.output_layer.bias', 'decoder.layers.2.trg_trg_att.output_layer.weight', 
'decoder.layers.2.trg_trg_att.q_layer.bias', 'decoder.layers.2.trg_trg_att.q_layer.weight', 'decoder.layers.2.trg_trg_att.v_layer.bias', 'decoder.layers.2.trg_trg_att.v_layer.weight', 'decoder.layers.2.x_layer_norm.bias', 'decoder.layers.2.x_layer_norm.weight', 'decoder.layers.3.dec_layer_norm.bias', 'decoder.layers.3.dec_layer_norm.weight', 'decoder.layers.3.feed_forward.layer_norm.bias', 'decoder.layers.3.feed_forward.layer_norm.weight', 'decoder.layers.3.feed_forward.pwff_layer.0.bias', 'decoder.layers.3.feed_forward.pwff_layer.0.weight', 'decoder.layers.3.feed_forward.pwff_layer.3.bias', 'decoder.layers.3.feed_forward.pwff_layer.3.weight', 'decoder.layers.3.src_trg_att.k_layer.bias', 'decoder.layers.3.src_trg_att.k_layer.weight', 'decoder.layers.3.src_trg_att.output_layer.bias', 'decoder.layers.3.src_trg_att.output_layer.weight', 'decoder.layers.3.src_trg_att.q_layer.bias', 'decoder.layers.3.src_trg_att.q_layer.weight', 'decoder.layers.3.src_trg_att.v_layer.bias', 'decoder.layers.3.src_trg_att.v_layer.weight', 'decoder.layers.3.trg_trg_att.k_layer.bias', 'decoder.layers.3.trg_trg_att.k_layer.weight', 'decoder.layers.3.trg_trg_att.output_layer.bias', 'decoder.layers.3.trg_trg_att.output_layer.weight', 'decoder.layers.3.trg_trg_att.q_layer.bias', 'decoder.layers.3.trg_trg_att.q_layer.weight', 'decoder.layers.3.trg_trg_att.v_layer.bias', 'decoder.layers.3.trg_trg_att.v_layer.weight', 'decoder.layers.3.x_layer_norm.bias', 'decoder.layers.3.x_layer_norm.weight', 'decoder.layers.4.dec_layer_norm.bias', 'decoder.layers.4.dec_layer_norm.weight', 'decoder.layers.4.feed_forward.layer_norm.bias', 'decoder.layers.4.feed_forward.layer_norm.weight', 'decoder.layers.4.feed_forward.pwff_layer.0.bias', 'decoder.layers.4.feed_forward.pwff_layer.0.weight', 'decoder.layers.4.feed_forward.pwff_layer.3.bias', 'decoder.layers.4.feed_forward.pwff_layer.3.weight', 'decoder.layers.4.src_trg_att.k_layer.bias', 'decoder.layers.4.src_trg_att.k_layer.weight', 'decoder.layers.4.src_trg_att.output_layer.bias', 'decoder.layers.4.src_trg_att.output_layer.weight', 'decoder.layers.4.src_trg_att.q_layer.bias', 'decoder.layers.4.src_trg_att.q_layer.weight', 'decoder.layers.4.src_trg_att.v_layer.bias', 'decoder.layers.4.src_trg_att.v_layer.weight', 'decoder.layers.4.trg_trg_att.k_layer.bias', 'decoder.layers.4.trg_trg_att.k_layer.weight', 'decoder.layers.4.trg_trg_att.output_layer.bias', 'decoder.layers.4.trg_trg_att.output_layer.weight', 'decoder.layers.4.trg_trg_att.q_layer.bias', 'decoder.layers.4.trg_trg_att.q_layer.weight', 'decoder.layers.4.trg_trg_att.v_layer.bias', 'decoder.layers.4.trg_trg_att.v_layer.weight', 'decoder.layers.4.x_layer_norm.bias', 'decoder.layers.4.x_layer_norm.weight', 'decoder.layers.5.dec_layer_norm.bias', 'decoder.layers.5.dec_layer_norm.weight', 'decoder.layers.5.feed_forward.layer_norm.bias', 'decoder.layers.5.feed_forward.layer_norm.weight', 'decoder.layers.5.feed_forward.pwff_layer.0.bias', 'decoder.layers.5.feed_forward.pwff_layer.0.weight', 'decoder.layers.5.feed_forward.pwff_layer.3.bias', 'decoder.layers.5.feed_forward.pwff_layer.3.weight', 'decoder.layers.5.src_trg_att.k_layer.bias', 'decoder.layers.5.src_trg_att.k_layer.weight', 'decoder.layers.5.src_trg_att.output_layer.bias', 'decoder.layers.5.src_trg_att.output_layer.weight', 'decoder.layers.5.src_trg_att.q_layer.bias', 'decoder.layers.5.src_trg_att.q_layer.weight', 'decoder.layers.5.src_trg_att.v_layer.bias', 'decoder.layers.5.src_trg_att.v_layer.weight', 'decoder.layers.5.trg_trg_att.k_layer.bias', 
'decoder.layers.5.trg_trg_att.k_layer.weight', 'decoder.layers.5.trg_trg_att.output_layer.bias', 'decoder.layers.5.trg_trg_att.output_layer.weight', 'decoder.layers.5.trg_trg_att.q_layer.bias', 'decoder.layers.5.trg_trg_att.q_layer.weight', 'decoder.layers.5.trg_trg_att.v_layer.bias', 'decoder.layers.5.trg_trg_att.v_layer.weight', 'decoder.layers.5.x_layer_norm.bias', 'decoder.layers.5.x_layer_norm.weight', 'encoder.layer_norm.bias', 'encoder.layer_norm.weight', 'encoder.layers.0.feed_forward.layer_norm.bias', 'encoder.layers.0.feed_forward.layer_norm.weight', 'encoder.layers.0.feed_forward.pwff_layer.0.bias', 'encoder.layers.0.feed_forward.pwff_layer.0.weight', 'encoder.layers.0.feed_forward.pwff_layer.3.bias', 'encoder.layers.0.feed_forward.pwff_layer.3.weight', 'encoder.layers.0.layer_norm.bias', 'encoder.layers.0.layer_norm.weight', 'encoder.layers.0.src_src_att.k_layer.bias', 'encoder.layers.0.src_src_att.k_layer.weight', 'encoder.layers.0.src_src_att.output_layer.bias', 'encoder.layers.0.src_src_att.output_layer.weight', 'encoder.layers.0.src_src_att.q_layer.bias', 'encoder.layers.0.src_src_att.q_layer.weight', 'encoder.layers.0.src_src_att.v_layer.bias', 'encoder.layers.0.src_src_att.v_layer.weight', 'encoder.layers.1.feed_forward.layer_norm.bias', 'encoder.layers.1.feed_forward.layer_norm.weight', 'encoder.layers.1.feed_forward.pwff_layer.0.bias', 'encoder.layers.1.feed_forward.pwff_layer.0.weight', 'encoder.layers.1.feed_forward.pwff_layer.3.bias', 'encoder.layers.1.feed_forward.pwff_layer.3.weight', 'encoder.layers.1.layer_norm.bias', 'encoder.layers.1.layer_norm.weight', 'encoder.layers.1.src_src_att.k_layer.bias', 'encoder.layers.1.src_src_att.k_layer.weight', 'encoder.layers.1.src_src_att.output_layer.bias', 'encoder.layers.1.src_src_att.output_layer.weight', 'encoder.layers.1.src_src_att.q_layer.bias', 'encoder.layers.1.src_src_att.q_layer.weight', 'encoder.layers.1.src_src_att.v_layer.bias', 'encoder.layers.1.src_src_att.v_layer.weight', 'encoder.layers.2.feed_forward.layer_norm.bias', 'encoder.layers.2.feed_forward.layer_norm.weight', 'encoder.layers.2.feed_forward.pwff_layer.0.bias', 'encoder.layers.2.feed_forward.pwff_layer.0.weight', 'encoder.layers.2.feed_forward.pwff_layer.3.bias', 'encoder.layers.2.feed_forward.pwff_layer.3.weight', 'encoder.layers.2.layer_norm.bias', 'encoder.layers.2.layer_norm.weight', 'encoder.layers.2.src_src_att.k_layer.bias', 'encoder.layers.2.src_src_att.k_layer.weight', 'encoder.layers.2.src_src_att.output_layer.bias', 'encoder.layers.2.src_src_att.output_layer.weight', 'encoder.layers.2.src_src_att.q_layer.bias', 'encoder.layers.2.src_src_att.q_layer.weight', 'encoder.layers.2.src_src_att.v_layer.bias', 'encoder.layers.2.src_src_att.v_layer.weight', 'encoder.layers.3.feed_forward.layer_norm.bias', 'encoder.layers.3.feed_forward.layer_norm.weight', 'encoder.layers.3.feed_forward.pwff_layer.0.bias', 'encoder.layers.3.feed_forward.pwff_layer.0.weight', 'encoder.layers.3.feed_forward.pwff_layer.3.bias', 'encoder.layers.3.feed_forward.pwff_layer.3.weight', 'encoder.layers.3.layer_norm.bias', 'encoder.layers.3.layer_norm.weight', 'encoder.layers.3.src_src_att.k_layer.bias', 'encoder.layers.3.src_src_att.k_layer.weight', 'encoder.layers.3.src_src_att.output_layer.bias', 'encoder.layers.3.src_src_att.output_layer.weight', 'encoder.layers.3.src_src_att.q_layer.bias', 'encoder.layers.3.src_src_att.q_layer.weight', 'encoder.layers.3.src_src_att.v_layer.bias', 'encoder.layers.3.src_src_att.v_layer.weight', 
'encoder.layers.4.feed_forward.layer_norm.bias', 'encoder.layers.4.feed_forward.layer_norm.weight', 'encoder.layers.4.feed_forward.pwff_layer.0.bias', 'encoder.layers.4.feed_forward.pwff_layer.0.weight', 'encoder.layers.4.feed_forward.pwff_layer.3.bias', 'encoder.layers.4.feed_forward.pwff_layer.3.weight', 'encoder.layers.4.layer_norm.bias', 'encoder.layers.4.layer_norm.weight', 'encoder.layers.4.src_src_att.k_layer.bias', 'encoder.layers.4.src_src_att.k_layer.weight', 'encoder.layers.4.src_src_att.output_layer.bias', 'encoder.layers.4.src_src_att.output_layer.weight', 'encoder.layers.4.src_src_att.q_layer.bias', 'encoder.layers.4.src_src_att.q_layer.weight', 'encoder.layers.4.src_src_att.v_layer.bias', 'encoder.layers.4.src_src_att.v_layer.weight', 'encoder.layers.5.feed_forward.layer_norm.bias', 'encoder.layers.5.feed_forward.layer_norm.weight', 'encoder.layers.5.feed_forward.pwff_layer.0.bias', 'encoder.layers.5.feed_forward.pwff_layer.0.weight', 'encoder.layers.5.feed_forward.pwff_layer.3.bias', 'encoder.layers.5.feed_forward.pwff_layer.3.weight', 'encoder.layers.5.layer_norm.bias', 'encoder.layers.5.layer_norm.weight', 'encoder.layers.5.src_src_att.k_layer.bias', 'encoder.layers.5.src_src_att.k_layer.weight', 'encoder.layers.5.src_src_att.output_layer.bias', 'encoder.layers.5.src_src_att.output_layer.weight', 'encoder.layers.5.src_src_att.q_layer.bias', 'encoder.layers.5.src_src_att.q_layer.weight', 'encoder.layers.5.src_src_att.v_layer.bias', 'encoder.layers.5.src_src_att.v_layer.weight', 'src_embed.lut.weight']\n",
            "2020-02-04 19:35:03,901 cfg.name                           : enkmb_transformer\n",
            "2020-02-04 19:35:03,901 cfg.data.src                       : en\n",
            "2020-02-04 19:35:03,901 cfg.data.trg                       : kmb\n",
            "2020-02-04 19:35:03,901 cfg.data.train                     : data/enkmb/train.bpe\n",
            "2020-02-04 19:35:03,901 cfg.data.dev                       : data/enkmb/dev.bpe\n",
            "2020-02-04 19:35:03,901 cfg.data.test                      : data/enkmb/test.bpe\n",
            "2020-02-04 19:35:03,901 cfg.data.level                     : bpe\n",
            "2020-02-04 19:35:03,901 cfg.data.lowercase                 : False\n",
            "2020-02-04 19:35:03,901 cfg.data.max_sent_length           : 100\n",
            "2020-02-04 19:35:03,902 cfg.data.src_vocab                 : data/enkmb/vocab.txt\n",
            "2020-02-04 19:35:03,902 cfg.data.trg_vocab                 : data/enkmb/vocab.txt\n",
            "2020-02-04 19:35:03,902 cfg.testing.beam_size              : 5\n",
            "2020-02-04 19:35:03,902 cfg.testing.alpha                  : 1.0\n",
            "2020-02-04 19:35:03,902 cfg.training.random_seed           : 42\n",
            "2020-02-04 19:35:03,902 cfg.training.optimizer             : adam\n",
            "2020-02-04 19:35:03,902 cfg.training.normalization         : tokens\n",
            "2020-02-04 19:35:03,902 cfg.training.adam_betas            : [0.9, 0.999]\n",
            "2020-02-04 19:35:03,902 cfg.training.scheduling            : noam\n",
            "2020-02-04 19:35:03,902 cfg.training.learning_rate_factor  : 0.5\n",
            "2020-02-04 19:35:03,902 cfg.training.learning_rate_warmup  : 1000\n",
            "2020-02-04 19:35:03,902 cfg.training.patience              : 8\n",
            "2020-02-04 19:35:03,902 cfg.training.decrease_factor       : 0.7\n",
            "2020-02-04 19:35:03,902 cfg.training.loss                  : crossentropy\n",
            "2020-02-04 19:35:03,902 cfg.training.learning_rate         : 0.0002\n",
            "2020-02-04 19:35:03,902 cfg.training.learning_rate_min     : 1e-08\n",
            "2020-02-04 19:35:03,902 cfg.training.weight_decay          : 0.0\n",
            "2020-02-04 19:35:03,902 cfg.training.label_smoothing       : 0.1\n",
            "2020-02-04 19:35:03,902 cfg.training.batch_size            : 4096\n",
            "2020-02-04 19:35:03,902 cfg.training.batch_type            : token\n",
            "2020-02-04 19:35:03,902 cfg.training.eval_batch_size       : 3600\n",
            "2020-02-04 19:35:03,902 cfg.training.eval_batch_type       : token\n",
            "2020-02-04 19:35:03,902 cfg.training.batch_multiplier      : 1\n",
            "2020-02-04 19:35:03,902 cfg.training.early_stopping_metric : eval_metric\n",
            "2020-02-04 19:35:03,902 cfg.training.epochs                : 40\n",
            "2020-02-04 19:35:03,902 cfg.training.validation_freq       : 2000\n",
            "2020-02-04 19:35:03,903 cfg.training.logging_freq          : 200\n",
            "2020-02-04 19:35:03,903 cfg.training.eval_metric           : bleu\n",
            "2020-02-04 19:35:03,903 cfg.training.model_dir             : models/enkmb_transformer\n",
            "2020-02-04 19:35:03,903 cfg.training.overwrite             : True\n",
            "2020-02-04 19:35:03,903 cfg.training.shuffle               : True\n",
            "2020-02-04 19:35:03,903 cfg.training.use_cuda              : True\n",
            "2020-02-04 19:35:03,903 cfg.training.max_output_length     : 100\n",
            "2020-02-04 19:35:03,903 cfg.training.print_valid_sents     : [0, 1, 2, 3]\n",
            "2020-02-04 19:35:03,903 cfg.training.keep_last_ckpts       : 3\n",
            "2020-02-04 19:35:03,903 cfg.model.initializer              : xavier\n",
            "2020-02-04 19:35:03,903 cfg.model.bias_initializer         : zeros\n",
            "2020-02-04 19:35:03,903 cfg.model.init_gain                : 1.0\n",
            "2020-02-04 19:35:03,903 cfg.model.embed_initializer        : xavier\n",
            "2020-02-04 19:35:03,903 cfg.model.embed_init_gain          : 1.0\n",
            "2020-02-04 19:35:03,903 cfg.model.tied_embeddings          : True\n",
            "2020-02-04 19:35:03,903 cfg.model.tied_softmax             : True\n",
            "2020-02-04 19:35:03,903 cfg.model.encoder.type             : transformer\n",
            "2020-02-04 19:35:03,903 cfg.model.encoder.num_layers       : 6\n",
            "2020-02-04 19:35:03,903 cfg.model.encoder.num_heads        : 8\n",
            "2020-02-04 19:35:03,903 cfg.model.encoder.embeddings.embedding_dim : 512\n",
            "2020-02-04 19:35:03,903 cfg.model.encoder.embeddings.scale : True\n",
            "2020-02-04 19:35:03,903 cfg.model.encoder.embeddings.dropout : 0.0\n",
            "2020-02-04 19:35:03,903 cfg.model.encoder.hidden_size      : 512\n",
            "2020-02-04 19:35:03,903 cfg.model.encoder.ff_size          : 2048\n",
            "2020-02-04 19:35:03,903 cfg.model.encoder.dropout          : 0.3\n",
            "2020-02-04 19:35:03,903 cfg.model.decoder.type             : transformer\n",
            "2020-02-04 19:35:03,903 cfg.model.decoder.num_layers       : 6\n",
            "2020-02-04 19:35:03,903 cfg.model.decoder.num_heads        : 8\n",
            "2020-02-04 19:35:03,903 cfg.model.decoder.embeddings.embedding_dim : 512\n",
            "2020-02-04 19:35:03,904 cfg.model.decoder.embeddings.scale : True\n",
            "2020-02-04 19:35:03,904 cfg.model.decoder.embeddings.dropout : 0.0\n",
            "2020-02-04 19:35:03,904 cfg.model.decoder.hidden_size      : 512\n",
            "2020-02-04 19:35:03,904 cfg.model.decoder.ff_size          : 2048\n",
            "2020-02-04 19:35:03,904 cfg.model.decoder.dropout          : 0.3\n",
            "2020-02-04 19:35:03,904 Data set sizes: \n",
            "\ttrain 75734,\n",
            "\tvalid 1000,\n",
            "\ttest 2693\n",
            "2020-02-04 19:35:03,904 First training example:\n",
            "\t[SRC] some may feel that they do not need anyone to explain the bible to them .\n",
            "\t[TRG] nange sai athu a banza kuila , ka bhingi muthu phala ku a jimb@@ ulu@@ ila o milongi ia bibidia .\n",
            "2020-02-04 19:35:03,904 First 10 words (src): (0) <unk> (1) <pad> (2) <s> (3) </s> (4) , (5) . (6) o (7) ku (8) mu (9) a\n",
            "2020-02-04 19:35:03,904 First 10 words (trg): (0) <unk> (1) <pad> (2) <s> (3) </s> (4) , (5) . (6) o (7) ku (8) mu (9) a\n",
            "2020-02-04 19:35:03,904 Number of Src words (types): 4120\n",
            "2020-02-04 19:35:03,905 Number of Trg words (types): 4120\n",
            "2020-02-04 19:35:03,905 Model(\n",
            "\tencoder=TransformerEncoder(num_layers=6, num_heads=8),\n",
            "\tdecoder=TransformerDecoder(num_layers=6, num_heads=8),\n",
            "\tsrc_embed=Embeddings(embedding_dim=512, vocab_size=4120),\n",
            "\ttrg_embed=Embeddings(embedding_dim=512, vocab_size=4120))\n",
            "2020-02-04 19:35:03,908 EPOCH 1\n",
            "2020-02-04 19:36:07,411 Epoch   1 Step:      200 Batch Loss:     4.373751 Tokens per Sec:     7390, Lr: 0.000140\n",
            "2020-02-04 19:37:16,614 Epoch   1 Step:      400 Batch Loss:     3.446606 Tokens per Sec:     6708, Lr: 0.000280\n",
            "2020-02-04 19:38:25,797 Epoch   1 Step:      600 Batch Loss:     3.172154 Tokens per Sec:     6715, Lr: 0.000419\n",
            "2020-02-04 19:39:09,436 Epoch   1: total training loss 2923.78\n",
            "2020-02-04 19:39:09,436 EPOCH 2\n",
            "2020-02-04 19:39:35,740 Epoch   2 Step:      800 Batch Loss:     3.004483 Tokens per Sec:     6665, Lr: 0.000559\n",
            "2020-02-04 19:40:45,337 Epoch   2 Step:     1000 Batch Loss:     2.668867 Tokens per Sec:     6703, Lr: 0.000699\n",
            "2020-02-04 19:41:55,230 Epoch   2 Step:     1200 Batch Loss:     2.925653 Tokens per Sec:     6683, Lr: 0.000638\n",
            "2020-02-04 19:43:05,191 Epoch   2 Step:     1400 Batch Loss:     2.579067 Tokens per Sec:     6708, Lr: 0.000591\n",
            "2020-02-04 19:43:22,214 Epoch   2: total training loss 1923.77\n",
            "2020-02-04 19:43:22,214 EPOCH 3\n",
            "2020-02-04 19:44:14,693 Epoch   3 Step:     1600 Batch Loss:     2.535758 Tokens per Sec:     6675, Lr: 0.000552\n",
            "2020-02-04 19:45:24,087 Epoch   3 Step:     1800 Batch Loss:     2.087405 Tokens per Sec:     6751, Lr: 0.000521\n",
            "2020-02-04 19:46:33,332 Epoch   3 Step:     2000 Batch Loss:     2.259302 Tokens per Sec:     6709, Lr: 0.000494\n",
            "2020-02-04 19:47:00,454 Hooray! New best validation result [eval_metric]!\n",
            "2020-02-04 19:47:00,454 Saving new checkpoint.\n",
            "2020-02-04 19:47:02,115 Example #0\n",
            "2020-02-04 19:47:02,116 \tSource:     ( read philippians 2 : 5 - 8 . )\n",
            "2020-02-04 19:47:02,116 \tReference:  ( tanga filipe 2 : 5 - 8 . )\n",
            "2020-02-04 19:47:02,116 \tHypothesis: ( tanga filipe 2 : 5 - 8 . )\n",
            "2020-02-04 19:47:02,116 Example #1\n",
            "2020-02-04 19:47:02,116 \tSource:     today , more than ever before , there are so many things that can distract us .\n",
            "2020-02-04 19:47:02,116 \tReference:  lelu , kuene ima iavulu i tena ku tu landukisa , m’ukulu ndenge .\n",
            "2020-02-04 19:47:02,117 \tHypothesis: lelu , kuene ima iavulu i tena ku tu kuatekesa .\n",
            "2020-02-04 19:47:02,117 Example #2\n",
            "2020-02-04 19:47:02,117 \tSource:     the apostle paul warned about what can happen if we please ourselves first .\n",
            "2020-02-04 19:47:02,117 \tReference:  ( 1 nzuá 2 : 16 ) o poxolo phaulu ua tu dimuna ku ima i tena kubhita se tu suua ngó o ima i tua uabhela .\n",
            "2020-02-04 19:47:02,117 \tHypothesis: o poxolo phaulu ua dimuna o ima i tu bhanga se tu bhanga .\n",
            "2020-02-04 19:47:02,117 Example #3\n",
            "2020-02-04 19:47:02,117 \tSource:     what a privilege it is to live in these last days and to be part of jehovah’s incredible organization !\n",
            "2020-02-04 19:47:02,117 \tReference:  tua tokala ku lembalala izuua ioso kuila , ujitu ua dikota ku tokala mu kilunga kia jihova mu ixi , mu izuua isukidila - ku !\n",
            "2020-02-04 19:47:02,117 \tHypothesis: o ujitu iú , u tena ku tu kuatekesa ku kala ni ukamba uambote ni jihova !\n",
            "2020-02-04 19:47:02,117 Validation result (greedy) at epoch   3, step     2000: bleu:  14.86, loss: 47097.8398, ppl:   8.2633, duration: 28.7850s\n",
            "2020-02-04 19:48:03,094 Epoch   3: total training loss 1615.38\n",
            "2020-02-04 19:48:03,094 EPOCH 4\n",
            "2020-02-04 19:48:11,783 Epoch   4 Step:     2200 Batch Loss:     2.120787 Tokens per Sec:     6459, Lr: 0.000471\n",
            "2020-02-04 19:49:20,971 Epoch   4 Step:     2400 Batch Loss:     1.741877 Tokens per Sec:     6705, Lr: 0.000451\n",
            "2020-02-04 19:50:30,588 Epoch   4 Step:     2600 Batch Loss:     2.465448 Tokens per Sec:     6751, Lr: 0.000433\n",
            "2020-02-04 19:51:39,815 Epoch   4 Step:     2800 Batch Loss:     2.057468 Tokens per Sec:     6758, Lr: 0.000418\n",
            "2020-02-04 19:52:14,527 Epoch   4: total training loss 1468.87\n",
            "2020-02-04 19:52:14,527 EPOCH 5\n",
            "2020-02-04 19:52:48,681 Epoch   5 Step:     3000 Batch Loss:     1.418909 Tokens per Sec:     6716, Lr: 0.000403\n",
            "2020-02-04 19:53:57,999 Epoch   5 Step:     3200 Batch Loss:     1.938244 Tokens per Sec:     6788, Lr: 0.000391\n",
            "2020-02-04 19:55:07,478 Epoch   5 Step:     3400 Batch Loss:     1.893386 Tokens per Sec:     6756, Lr: 0.000379\n",
            "2020-02-04 19:56:16,738 Epoch   5 Step:     3600 Batch Loss:     1.922130 Tokens per Sec:     6749, Lr: 0.000368\n",
            "2020-02-04 19:56:24,670 Epoch   5: total training loss 1360.27\n",
            "2020-02-04 19:56:24,671 EPOCH 6\n",
            "2020-02-04 19:57:26,127 Epoch   6 Step:     3800 Batch Loss:     1.733316 Tokens per Sec:     6793, Lr: 0.000358\n",
            "2020-02-04 19:58:35,178 Epoch   6 Step:     4000 Batch Loss:     1.426156 Tokens per Sec:     6764, Lr: 0.000349\n",
            "2020-02-04 19:59:03,345 Hooray! New best validation result [eval_metric]!\n",
            "2020-02-04 19:59:03,345 Saving new checkpoint.\n",
            "2020-02-04 19:59:04,958 Example #0\n",
            "2020-02-04 19:59:04,959 \tSource:     ( read philippians 2 : 5 - 8 . )\n",
            "2020-02-04 19:59:04,959 \tReference:  ( tanga filipe 2 : 5 - 8 . )\n",
            "2020-02-04 19:59:04,959 \tHypothesis: ( tanga filipe 2 : 5 - 8 . )\n",
            "2020-02-04 19:59:04,959 Example #1\n",
            "2020-02-04 19:59:04,959 \tSource:     today , more than ever before , there are so many things that can distract us .\n",
            "2020-02-04 19:59:04,959 \tReference:  lelu , kuene ima iavulu i tena ku tu landukisa , m’ukulu ndenge .\n",
            "2020-02-04 19:59:04,960 \tHypothesis: lelu , kua bhingi dingi ima iavulu , phala ku tu kuatekesa ku dibhana ni maka enhá .\n",
            "2020-02-04 19:59:04,960 Example #2\n",
            "2020-02-04 19:59:04,960 \tSource:     the apostle paul warned about what can happen if we please ourselves first .\n",
            "2020-02-04 19:59:04,960 \tReference:  ( 1 nzuá 2 : 16 ) o poxolo phaulu ua tu dimuna ku ima i tena kubhita se tu suua ngó o ima i tua uabhela .\n",
            "2020-02-04 19:59:04,960 \tHypothesis: o poxolo phaulu ua tendelesa o ima ia - nda bhita se tu bhanga o ima ia iibha .\n",
            "2020-02-04 19:59:04,960 Example #3\n",
            "2020-02-04 19:59:04,960 \tSource:     what a privilege it is to live in these last days and to be part of jehovah’s incredible organization !\n",
            "2020-02-04 19:59:04,960 \tReference:  tua tokala ku lembalala izuua ioso kuila , ujitu ua dikota ku tokala mu kilunga kia jihova mu ixi , mu izuua isukidila - ku !\n",
            "2020-02-04 19:59:04,960 \tHypothesis: ujitu ua dikota ku kala ku muenhu mu izuua íii isukidila - ku , ni ku kala ni ukamba uambote ni jihova !\n",
            "2020-02-04 19:59:04,960 Validation result (greedy) at epoch   6, step     4000: bleu:  20.91, loss: 39856.2031, ppl:   5.9722, duration: 29.7820s\n",
            "2020-02-04 20:00:13,886 Epoch   6 Step:     4200 Batch Loss:     1.681094 Tokens per Sec:     6784, Lr: 0.000341\n",
            "2020-02-04 20:01:04,062 Epoch   6: total training loss 1288.14\n",
            "2020-02-04 20:01:04,062 EPOCH 7\n",
            "2020-02-04 20:01:22,592 Epoch   7 Step:     4400 Batch Loss:     1.869816 Tokens per Sec:     6580, Lr: 0.000333\n",
            "2020-02-04 20:02:32,112 Epoch   7 Step:     4600 Batch Loss:     1.868371 Tokens per Sec:     6776, Lr: 0.000326\n",
            "2020-02-04 20:03:40,888 Epoch   7 Step:     4800 Batch Loss:     1.494951 Tokens per Sec:     6731, Lr: 0.000319\n",
            "2020-02-04 20:04:49,964 Epoch   7 Step:     5000 Batch Loss:     1.940855 Tokens per Sec:     6738, Lr: 0.000313\n",
            "2020-02-04 20:05:14,992 Epoch   7: total training loss 1217.81\n",
            "2020-02-04 20:05:14,993 EPOCH 8\n",
            "2020-02-04 20:05:59,350 Epoch   8 Step:     5200 Batch Loss:     1.924535 Tokens per Sec:     6736, Lr: 0.000306\n",
            "2020-02-04 20:07:08,852 Epoch   8 Step:     5400 Batch Loss:     1.800949 Tokens per Sec:     6749, Lr: 0.000301\n",
            "2020-02-04 20:08:17,863 Epoch   8 Step:     5600 Batch Loss:     1.651664 Tokens per Sec:     6767, Lr: 0.000295\n",
            "2020-02-04 20:09:25,274 Epoch   8: total training loss 1154.32\n",
            "2020-02-04 20:09:25,274 EPOCH 9\n",
            "2020-02-04 20:09:26,981 Epoch   9 Step:     5800 Batch Loss:     1.459499 Tokens per Sec:     5603, Lr: 0.000290\n",
            "2020-02-04 20:10:36,310 Epoch   9 Step:     6000 Batch Loss:     1.670126 Tokens per Sec:     6785, Lr: 0.000285\n",
            "2020-02-04 20:11:02,199 Hooray! New best validation result [eval_metric]!\n",
            "2020-02-04 20:11:02,199 Saving new checkpoint.\n",
            "2020-02-04 20:11:03,924 Example #0\n",
            "2020-02-04 20:11:03,924 \tSource:     ( read philippians 2 : 5 - 8 . )\n",
            "2020-02-04 20:11:03,924 \tReference:  ( tanga filipe 2 : 5 - 8 . )\n",
            "2020-02-04 20:11:03,924 \tHypothesis: ( tanga filipe 2 : 5 - 8 . )\n",
            "2020-02-04 20:11:03,924 Example #1\n",
            "2020-02-04 20:11:03,925 \tSource:     today , more than ever before , there are so many things that can distract us .\n",
            "2020-02-04 20:11:03,925 \tReference:  lelu , kuene ima iavulu i tena ku tu landukisa , m’ukulu ndenge .\n",
            "2020-02-04 20:11:03,925 \tHypothesis: lelu , kua kambe ngó bhofele , ande dia ku bhita o ima i tena ku tu landukisa .\n",
            "2020-02-04 20:11:03,925 Example #2\n",
            "2020-02-04 20:11:03,925 \tSource:     the apostle paul warned about what can happen if we please ourselves first .\n",
            "2020-02-04 20:11:03,925 \tReference:  ( 1 nzuá 2 : 16 ) o poxolo phaulu ua tu dimuna ku ima i tena kubhita se tu suua ngó o ima i tua uabhela .\n",
            "2020-02-04 20:11:03,925 \tHypothesis: o poxolo phaulu ua dimuna ia lungu ni ima i tu bhita na - iu , se tu bhanga o ima i tua mesena .\n",
            "2020-02-04 20:11:03,925 Example #3\n",
            "2020-02-04 20:11:03,925 \tSource:     what a privilege it is to live in these last days and to be part of jehovah’s incredible organization !\n",
            "2020-02-04 20:11:03,925 \tReference:  tua tokala ku lembalala izuua ioso kuila , ujitu ua dikota ku tokala mu kilunga kia jihova mu ixi , mu izuua isukidila - ku !\n",
            "2020-02-04 20:11:03,925 \tHypothesis: ujitu ua dikota ku kala mu izuua isukidila - ku , ni ku bhanga mbandu ku kilunga kia jihova !\n",
            "2020-02-04 20:11:03,925 Validation result (greedy) at epoch   9, step     6000: bleu:  23.81, loss: 37121.8281, ppl:   5.2831, duration: 27.6148s\n",
            "2020-02-04 20:12:12,509 Epoch   9 Step:     6200 Batch Loss:     1.229483 Tokens per Sec:     6730, Lr: 0.000281\n",
            "2020-02-04 20:13:21,856 Epoch   9 Step:     6400 Batch Loss:     1.410562 Tokens per Sec:     6790, Lr: 0.000276\n",
            "2020-02-04 20:14:03,446 Epoch   9: total training loss 1114.11\n",
            "2020-02-04 20:14:03,446 EPOCH 10\n",
            "2020-02-04 20:14:31,209 Epoch  10 Step:     6600 Batch Loss:     1.540644 Tokens per Sec:     6716, Lr: 0.000272\n",
            "2020-02-04 20:15:39,776 Epoch  10 Step:     6800 Batch Loss:     1.627514 Tokens per Sec:     6744, Lr: 0.000268\n",
            "2020-02-04 20:16:49,088 Epoch  10 Step:     7000 Batch Loss:     1.325339 Tokens per Sec:     6780, Lr: 0.000264\n",
            "2020-02-04 20:17:58,791 Epoch  10 Step:     7200 Batch Loss:     1.359054 Tokens per Sec:     6726, Lr: 0.000260\n",
            "2020-02-04 20:18:13,983 Epoch  10: total training loss 1072.73\n",
            "2020-02-04 20:18:13,983 EPOCH 11\n",
            "2020-02-04 20:19:07,831 Epoch  11 Step:     7400 Batch Loss:     1.427467 Tokens per Sec:     6722, Lr: 0.000257\n",
            "2020-02-04 20:20:17,024 Epoch  11 Step:     7600 Batch Loss:     1.435952 Tokens per Sec:     6733, Lr: 0.000253\n",
            "2020-02-04 20:21:26,239 Epoch  11 Step:     7800 Batch Loss:     1.629339 Tokens per Sec:     6752, Lr: 0.000250\n",
            "2020-02-04 20:22:24,633 Epoch  11: total training loss 1036.26\n",
            "2020-02-04 20:22:24,633 EPOCH 12\n",
            "2020-02-04 20:22:35,411 Epoch  12 Step:     8000 Batch Loss:     1.378333 Tokens per Sec:     6642, Lr: 0.000247\n",
            "2020-02-04 20:23:01,670 Hooray! New best validation result [eval_metric]!\n",
            "2020-02-04 20:23:01,670 Saving new checkpoint.\n",
            "2020-02-04 20:23:03,279 Example #0\n",
            "2020-02-04 20:23:03,280 \tSource:     ( read philippians 2 : 5 - 8 . )\n",
            "2020-02-04 20:23:03,280 \tReference:  ( tanga filipe 2 : 5 - 8 . )\n",
            "2020-02-04 20:23:03,280 \tHypothesis: ( tanga filipe 2 : 5 - 8 . )\n",
            "2020-02-04 20:23:03,280 Example #1\n",
            "2020-02-04 20:23:03,280 \tSource:     today , more than ever before , there are so many things that can distract us .\n",
            "2020-02-04 20:23:03,280 \tReference:  lelu , kuene ima iavulu i tena ku tu landukisa , m’ukulu ndenge .\n",
            "2020-02-04 20:23:03,280 \tHypothesis: lelu , kua kambe ngó bhofele , ande dia ku bhita o ima ioso , saí ima iavulu i tena ku tu landukisa .\n",
            "2020-02-04 20:23:03,280 Example #2\n",
            "2020-02-04 20:23:03,280 \tSource:     the apostle paul warned about what can happen if we please ourselves first .\n",
            "2020-02-04 20:23:03,280 \tReference:  ( 1 nzuá 2 : 16 ) o poxolo phaulu ua tu dimuna ku ima i tena kubhita se tu suua ngó o ima i tua uabhela .\n",
            "2020-02-04 20:23:03,280 \tHypothesis: o poxolo phaulu ua dimuna o ima i tu bhita , se tu bhanga o ima i tua mesena .\n",
            "2020-02-04 20:23:03,280 Example #3\n",
            "2020-02-04 20:23:03,281 \tSource:     what a privilege it is to live in these last days and to be part of jehovah’s incredible organization !\n",
            "2020-02-04 20:23:03,281 \tReference:  tua tokala ku lembalala izuua ioso kuila , ujitu ua dikota ku tokala mu kilunga kia jihova mu ixi , mu izuua isukidila - ku !\n",
            "2020-02-04 20:23:03,281 \tHypothesis: ujitu ua dikota ku kala mu izuua isukidila - ku , ni ku bhanga mbandu ku kilunga kia jihova !\n",
            "2020-02-04 20:23:03,281 Validation result (greedy) at epoch  12, step     8000: bleu:  25.00, loss: 35674.4961, ppl:   4.9511, duration: 27.8696s\n",
            "2020-02-04 20:24:12,367 Epoch  12 Step:     8200 Batch Loss:     1.231285 Tokens per Sec:     6702, Lr: 0.000244\n",
            "2020-02-04 20:25:21,048 Epoch  12 Step:     8400 Batch Loss:     1.578112 Tokens per Sec:     6722, Lr: 0.000241\n",
            "2020-02-04 20:26:30,454 Epoch  12 Step:     8600 Batch Loss:     1.382758 Tokens per Sec:     6747, Lr: 0.000238\n",
            "2020-02-04 20:27:03,588 Epoch  12: total training loss 1010.16\n",
            "2020-02-04 20:27:03,589 EPOCH 13\n",
            "2020-02-04 20:27:39,888 Epoch  13 Step:     8800 Batch Loss:     1.088220 Tokens per Sec:     6760, Lr: 0.000236\n",
            "2020-02-04 20:28:49,663 Epoch  13 Step:     9000 Batch Loss:     1.332197 Tokens per Sec:     6763, Lr: 0.000233\n",
            "2020-02-04 20:29:58,891 Epoch  13 Step:     9200 Batch Loss:     1.640509 Tokens per Sec:     6770, Lr: 0.000230\n",
            "2020-02-04 20:31:08,391 Epoch  13 Step:     9400 Batch Loss:     1.451203 Tokens per Sec:     6703, Lr: 0.000228\n",
            "2020-02-04 20:31:14,434 Epoch  13: total training loss 970.57\n",
            "2020-02-04 20:31:14,434 EPOCH 14\n",
            "2020-02-04 20:32:17,273 Epoch  14 Step:     9600 Batch Loss:     1.308440 Tokens per Sec:     6706, Lr: 0.000226\n",
            "2020-02-04 20:33:25,976 Epoch  14 Step:     9800 Batch Loss:     1.202386 Tokens per Sec:     6760, Lr: 0.000223\n",
            "2020-02-04 20:34:35,241 Epoch  14 Step:    10000 Batch Loss:     1.115765 Tokens per Sec:     6750, Lr: 0.000221\n",
            "2020-02-04 20:34:57,532 Hooray! New best validation result [eval_metric]!\n",
            "2020-02-04 20:34:57,532 Saving new checkpoint.\n",
            "2020-02-04 20:34:59,217 Example #0\n",
            "2020-02-04 20:34:59,217 \tSource:     ( read philippians 2 : 5 - 8 . )\n",
            "2020-02-04 20:34:59,217 \tReference:  ( tanga filipe 2 : 5 - 8 . )\n",
            "2020-02-04 20:34:59,217 \tHypothesis: ( tanga filipe 2 : 5 - 8 . )\n",
            "2020-02-04 20:34:59,217 Example #1\n",
            "2020-02-04 20:34:59,218 \tSource:     today , more than ever before , there are so many things that can distract us .\n",
            "2020-02-04 20:34:59,218 \tReference:  lelu , kuene ima iavulu i tena ku tu landukisa , m’ukulu ndenge .\n",
            "2020-02-04 20:34:59,218 \tHypothesis: lelu , kua kambe ngó , ande dia ku bhita o ima ioso , saí ima iavulu i tena ku tu landukisa .\n",
            "2020-02-04 20:34:59,218 Example #2\n",
            "2020-02-04 20:34:59,218 \tSource:     the apostle paul warned about what can happen if we please ourselves first .\n",
            "2020-02-04 20:34:59,218 \tReference:  ( 1 nzuá 2 : 16 ) o poxolo phaulu ua tu dimuna ku ima i tena kubhita se tu suua ngó o ima i tua uabhela .\n",
            "2020-02-04 20:34:59,218 \tHypothesis: o poxolo phaulu ua dimuna o ima i tena kubhita , se tu i sangulukisa .\n",
            "2020-02-04 20:34:59,218 Example #3\n",
            "2020-02-04 20:34:59,218 \tSource:     what a privilege it is to live in these last days and to be part of jehovah’s incredible organization !\n",
            "2020-02-04 20:34:59,218 \tReference:  tua tokala ku lembalala izuua ioso kuila , ujitu ua dikota ku tokala mu kilunga kia jihova mu ixi , mu izuua isukidila - ku !\n",
            "2020-02-04 20:34:59,218 \tHypothesis: ujitu ua dikota ku kala mu izuua íii isukidila - ku , ni ku bhanga mbandu ku kilunga kia jihova !\n",
            "2020-02-04 20:34:59,219 Validation result (greedy) at epoch  14, step    10000: bleu:  26.10, loss: 34720.7383, ppl:   4.7438, duration: 23.9776s\n",
            "2020-02-04 20:35:48,955 Epoch  14: total training loss 946.58\n",
            "2020-02-04 20:35:48,955 EPOCH 15\n",
            "2020-02-04 20:36:08,972 Epoch  15 Step:    10200 Batch Loss:     1.149951 Tokens per Sec:     6741, Lr: 0.000219\n",
            "2020-02-04 20:37:18,232 Epoch  15 Step:    10400 Batch Loss:     1.112914 Tokens per Sec:     6765, Lr: 0.000217\n",
            "2020-02-04 20:38:27,355 Epoch  15 Step:    10600 Batch Loss:     1.485635 Tokens per Sec:     6714, Lr: 0.000215\n",
            "2020-02-04 20:39:36,381 Epoch  15 Step:    10800 Batch Loss:     1.262681 Tokens per Sec:     6753, Lr: 0.000213\n",
            "2020-02-04 20:39:59,681 Epoch  15: total training loss 917.70\n",
            "2020-02-04 20:39:59,681 EPOCH 16\n",
            "2020-02-04 20:40:45,804 Epoch  16 Step:    11000 Batch Loss:     1.141444 Tokens per Sec:     6700, Lr: 0.000211\n",
            "2020-02-04 20:41:54,926 Epoch  16 Step:    11200 Batch Loss:     1.221109 Tokens per Sec:     6773, Lr: 0.000209\n",
            "2020-02-04 20:43:03,746 Epoch  16 Step:    11400 Batch Loss:     1.410407 Tokens per Sec:     6695, Lr: 0.000207\n",
            "2020-02-04 20:44:10,887 Epoch  16: total training loss 897.40\n",
            "2020-02-04 20:44:10,887 EPOCH 17\n",
            "2020-02-04 20:44:12,739 Epoch  17 Step:    11600 Batch Loss:     1.301653 Tokens per Sec:     6571, Lr: 0.000205\n",
            "2020-02-04 20:45:22,036 Epoch  17 Step:    11800 Batch Loss:     1.178325 Tokens per Sec:     6740, Lr: 0.000203\n",
            "2020-02-04 20:46:31,137 Epoch  17 Step:    12000 Batch Loss:     1.176379 Tokens per Sec:     6731, Lr: 0.000202\n",
            "2020-02-04 20:47:03,313 Hooray! New best validation result [eval_metric]!\n",
            "2020-02-04 20:47:03,314 Saving new checkpoint.\n",
            "2020-02-04 20:47:05,095 Example #0\n",
            "2020-02-04 20:47:05,095 \tSource:     ( read philippians 2 : 5 - 8 . )\n",
            "2020-02-04 20:47:05,095 \tReference:  ( tanga filipe 2 : 5 - 8 . )\n",
            "2020-02-04 20:47:05,095 \tHypothesis: ( tanga filipe 2 : 5 - 8 . )\n",
            "2020-02-04 20:47:05,095 Example #1\n",
            "2020-02-04 20:47:05,095 \tSource:     today , more than ever before , there are so many things that can distract us .\n",
            "2020-02-04 20:47:05,095 \tReference:  lelu , kuene ima iavulu i tena ku tu landukisa , m’ukulu ndenge .\n",
            "2020-02-04 20:47:05,096 \tHypothesis: lelu , ande dia ku bhita o ima , saí ima iavulu i tena ku tu landukisa .\n",
            "2020-02-04 20:47:05,096 Example #2\n",
            "2020-02-04 20:47:05,096 \tSource:     the apostle paul warned about what can happen if we please ourselves first .\n",
            "2020-02-04 20:47:05,096 \tReference:  ( 1 nzuá 2 : 16 ) o poxolo phaulu ua tu dimuna ku ima i tena kubhita se tu suua ngó o ima i tua uabhela .\n",
            "2020-02-04 20:47:05,096 \tHypothesis: o poxolo phaulu ua dimuna o ima i tu tena o kubhanga , se tu i sangulukisa o muxima ua nzambi .\n",
            "2020-02-04 20:47:05,096 Example #3\n",
            "2020-02-04 20:47:05,096 \tSource:     what a privilege it is to live in these last days and to be part of jehovah’s incredible organization !\n",
            "2020-02-04 20:47:05,096 \tReference:  tua tokala ku lembalala izuua ioso kuila , ujitu ua dikota ku tokala mu kilunga kia jihova mu ixi , mu izuua isukidila - ku !\n",
            "2020-02-04 20:47:05,096 \tHypothesis: ujitu ua dikota ku kala mu izuua íii isukidila - ku , ni ku bhanga mbandu ku kilunga kia jihova !\n",
            "2020-02-04 20:47:05,096 Validation result (greedy) at epoch  17, step    12000: bleu:  26.16, loss: 34417.0859, ppl:   4.6797, duration: 33.9585s\n",
            "2020-02-04 20:48:14,113 Epoch  17 Step:    12200 Batch Loss:     1.303897 Tokens per Sec:     6802, Lr: 0.000200\n",
            "2020-02-04 20:48:55,324 Epoch  17: total training loss 872.81\n",
            "2020-02-04 20:48:55,324 EPOCH 18\n",
            "2020-02-04 20:49:23,275 Epoch  18 Step:    12400 Batch Loss:     1.313688 Tokens per Sec:     6647, Lr: 0.000198\n",
            "2020-02-04 20:50:32,802 Epoch  18 Step:    12600 Batch Loss:     1.247998 Tokens per Sec:     6754, Lr: 0.000197\n",
            "2020-02-04 20:51:42,179 Epoch  18 Step:    12800 Batch Loss:     1.266422 Tokens per Sec:     6754, Lr: 0.000195\n",
            "2020-02-04 20:52:51,013 Epoch  18 Step:    13000 Batch Loss:     1.123238 Tokens per Sec:     6736, Lr: 0.000194\n",
            "2020-02-04 20:53:06,337 Epoch  18: total training loss 851.96\n",
            "2020-02-04 20:53:06,337 EPOCH 19\n",
            "2020-02-04 20:53:59,248 Epoch  19 Step:    13200 Batch Loss:     0.979365 Tokens per Sec:     6745, Lr: 0.000192\n",
            "2020-02-04 20:55:08,508 Epoch  19 Step:    13400 Batch Loss:     1.147491 Tokens per Sec:     6792, Lr: 0.000191\n",
            "2020-02-04 20:56:17,426 Epoch  19 Step:    13600 Batch Loss:     1.091969 Tokens per Sec:     6755, Lr: 0.000189\n",
            "2020-02-04 20:57:16,512 Epoch  19: total training loss 832.52\n",
            "2020-02-04 20:57:16,512 EPOCH 20\n",
            "2020-02-04 20:57:26,043 Epoch  20 Step:    13800 Batch Loss:     1.193934 Tokens per Sec:     6556, Lr: 0.000188\n",
            "2020-02-04 20:58:34,991 Epoch  20 Step:    14000 Batch Loss:     1.229089 Tokens per Sec:     6751, Lr: 0.000187\n",
            "2020-02-04 20:59:00,784 Hooray! New best validation result [eval_metric]!\n",
            "2020-02-04 20:59:00,785 Saving new checkpoint.\n",
            "2020-02-04 20:59:02,527 Example #0\n",
            "2020-02-04 20:59:02,527 \tSource:     ( read philippians 2 : 5 - 8 . )\n",
            "2020-02-04 20:59:02,527 \tReference:  ( tanga filipe 2 : 5 - 8 . )\n",
            "2020-02-04 20:59:02,527 \tHypothesis: ( tanga filipe 2 : 5 - 8 . )\n",
            "2020-02-04 20:59:02,527 Example #1\n",
            "2020-02-04 20:59:02,527 \tSource:     today , more than ever before , there are so many things that can distract us .\n",
            "2020-02-04 20:59:02,527 \tReference:  lelu , kuene ima iavulu i tena ku tu landukisa , m’ukulu ndenge .\n",
            "2020-02-04 20:59:02,528 \tHypothesis: lelu , ande dia ku bhita o ima , saí ima iavulu i tena ku tu landukisa .\n",
            "2020-02-04 20:59:02,528 Example #2\n",
            "2020-02-04 20:59:02,528 \tSource:     the apostle paul warned about what can happen if we please ourselves first .\n",
            "2020-02-04 20:59:02,528 \tReference:  ( 1 nzuá 2 : 16 ) o poxolo phaulu ua tu dimuna ku ima i tena kubhita se tu suua ngó o ima i tua uabhela .\n",
            "2020-02-04 20:59:02,528 \tHypothesis: o poxolo phaulu ua dimuna o ima i tena kubhita , se tu i sangulukisa o muxima uetu .\n",
            "2020-02-04 20:59:02,528 Example #3\n",
            "2020-02-04 20:59:02,528 \tSource:     what a privilege it is to live in these last days and to be part of jehovah’s incredible organization !\n",
            "2020-02-04 20:59:02,528 \tReference:  tua tokala ku lembalala izuua ioso kuila , ujitu ua dikota ku tokala mu kilunga kia jihova mu ixi , mu izuua isukidila - ku !\n",
            "2020-02-04 20:59:02,528 \tHypothesis: ujitu ua dikota ku kala mu izuua íii isukidila - ku , ni ku bhanga mbandu ku kilunga kia jihova !\n",
            "2020-02-04 20:59:02,528 Validation result (greedy) at epoch  20, step    14000: bleu:  27.21, loss: 34284.7305, ppl:   4.6520, duration: 27.5373s\n",
            "2020-02-04 21:00:11,822 Epoch  20 Step:    14200 Batch Loss:     1.060636 Tokens per Sec:     6718, Lr: 0.000185\n",
            "2020-02-04 21:01:20,983 Epoch  20 Step:    14400 Batch Loss:     1.281659 Tokens per Sec:     6710, Lr: 0.000184\n",
            "2020-02-04 21:01:55,348 Epoch  20: total training loss 815.14\n",
            "2020-02-04 21:01:55,348 EPOCH 21\n",
            "2020-02-04 21:02:30,501 Epoch  21 Step:    14600 Batch Loss:     1.254682 Tokens per Sec:     6780, Lr: 0.000183\n",
            "2020-02-04 21:03:40,040 Epoch  21 Step:    14800 Batch Loss:     1.393568 Tokens per Sec:     6728, Lr: 0.000182\n",
            "2020-02-04 21:04:48,713 Epoch  21 Step:    15000 Batch Loss:     0.893659 Tokens per Sec:     6724, Lr: 0.000180\n",
            "2020-02-04 21:05:57,599 Epoch  21 Step:    15200 Batch Loss:     0.912125 Tokens per Sec:     6745, Lr: 0.000179\n",
            "2020-02-04 21:06:06,220 Epoch  21: total training loss 794.84\n",
            "2020-02-04 21:06:06,220 EPOCH 22\n",
            "2020-02-04 21:07:06,241 Epoch  22 Step:    15400 Batch Loss:     1.034763 Tokens per Sec:     6755, Lr: 0.000178\n",
            "2020-02-04 21:08:14,942 Epoch  22 Step:    15600 Batch Loss:     1.081931 Tokens per Sec:     6748, Lr: 0.000177\n",
            "2020-02-04 21:09:24,367 Epoch  22 Step:    15800 Batch Loss:     1.300140 Tokens per Sec:     6756, Lr: 0.000176\n",
            "2020-02-04 21:10:16,419 Epoch  22: total training loss 776.06\n",
            "2020-02-04 21:10:16,419 EPOCH 23\n",
            "2020-02-04 21:10:33,331 Epoch  23 Step:    16000 Batch Loss:     1.199994 Tokens per Sec:     6680, Lr: 0.000175\n",
            "2020-02-04 21:11:06,115 Hooray! New best validation result [eval_metric]!\n",
            "2020-02-04 21:11:06,115 Saving new checkpoint.\n",
            "2020-02-04 21:11:07,697 Example #0\n",
            "2020-02-04 21:11:07,698 \tSource:     ( read philippians 2 : 5 - 8 . )\n",
            "2020-02-04 21:11:07,698 \tReference:  ( tanga filipe 2 : 5 - 8 . )\n",
            "2020-02-04 21:11:07,698 \tHypothesis: ( tanga filipe 2 : 5 - 8 . )\n",
            "2020-02-04 21:11:07,698 Example #1\n",
            "2020-02-04 21:11:07,698 \tSource:     today , more than ever before , there are so many things that can distract us .\n",
            "2020-02-04 21:11:07,698 \tReference:  lelu , kuene ima iavulu i tena ku tu landukisa , m’ukulu ndenge .\n",
            "2020-02-04 21:11:07,698 \tHypothesis: lelu , kua kambe ngó bhofele , ande dia ima iavulu i tena ku tu landukisa .\n",
            "2020-02-04 21:11:07,698 Example #2\n",
            "2020-02-04 21:11:07,698 \tSource:     the apostle paul warned about what can happen if we please ourselves first .\n",
            "2020-02-04 21:11:07,698 \tReference:  ( 1 nzuá 2 : 16 ) o poxolo phaulu ua tu dimuna ku ima i tena kubhita se tu suua ngó o ima i tua uabhela .\n",
            "2020-02-04 21:11:07,698 \tHypothesis: o poxolo phaulu ua dimuna o ima i tena kubhita , se tu mesena ku ta o kituxi .\n",
            "2020-02-04 21:11:07,698 Example #3\n",
            "2020-02-04 21:11:07,699 \tSource:     what a privilege it is to live in these last days and to be part of jehovah’s incredible organization !\n",
            "2020-02-04 21:11:07,699 \tReference:  tua tokala ku lembalala izuua ioso kuila , ujitu ua dikota ku tokala mu kilunga kia jihova mu ixi , mu izuua isukidila - ku !\n",
            "2020-02-04 21:11:07,699 \tHypothesis: ujitu ua dikota ku kala mu izuua íii isukidila - ku , ni ku bhanga mbandu ku kilunga kia jihova !\n",
            "2020-02-04 21:11:07,699 Validation result (greedy) at epoch  23, step    16000: bleu:  27.29, loss: 34352.4297, ppl:   4.6661, duration: 34.3671s\n",
            "2020-02-04 21:12:16,709 Epoch  23 Step:    16200 Batch Loss:     0.907302 Tokens per Sec:     6697, Lr: 0.000174\n",
            "2020-02-04 21:13:26,235 Epoch  23 Step:    16400 Batch Loss:     1.082953 Tokens per Sec:     6728, Lr: 0.000173\n",
            "2020-02-04 21:14:35,507 Epoch  23 Step:    16600 Batch Loss:     0.971708 Tokens per Sec:     6791, Lr: 0.000172\n",
            "2020-02-04 21:15:01,924 Epoch  23: total training loss 760.18\n",
            "2020-02-04 21:15:01,924 EPOCH 24\n",
            "2020-02-04 21:15:44,333 Epoch  24 Step:    16800 Batch Loss:     0.790211 Tokens per Sec:     6703, Lr: 0.000170\n",
            "2020-02-04 21:16:53,763 Epoch  24 Step:    17000 Batch Loss:     1.323298 Tokens per Sec:     6747, Lr: 0.000169\n",
            "2020-02-04 21:18:03,061 Epoch  24 Step:    17200 Batch Loss:     1.141842 Tokens per Sec:     6748, Lr: 0.000168\n",
            "2020-02-04 21:19:12,428 Epoch  24 Step:    17400 Batch Loss:     1.137136 Tokens per Sec:     6745, Lr: 0.000168\n",
            "2020-02-04 21:19:12,779 Epoch  24: total training loss 742.43\n",
            "2020-02-04 21:19:12,779 EPOCH 25\n",
            "2020-02-04 21:20:22,121 Epoch  25 Step:    17600 Batch Loss:     1.002455 Tokens per Sec:     6736, Lr: 0.000167\n",
            "2020-02-04 21:21:31,568 Epoch  25 Step:    17800 Batch Loss:     1.016481 Tokens per Sec:     6740, Lr: 0.000166\n",
            "2020-02-04 21:22:40,598 Epoch  25 Step:    18000 Batch Loss:     0.992400 Tokens per Sec:     6755, Lr: 0.000165\n",
            "2020-02-04 21:23:06,908 Hooray! New best validation result [eval_metric]!\n",
            "2020-02-04 21:23:06,909 Saving new checkpoint.\n",
            "2020-02-04 21:23:08,660 Example #0\n",
            "2020-02-04 21:23:08,660 \tSource:     ( read philippians 2 : 5 - 8 . )\n",
            "2020-02-04 21:23:08,660 \tReference:  ( tanga filipe 2 : 5 - 8 . )\n",
            "2020-02-04 21:23:08,660 \tHypothesis: ( tanga filipe 2 : 5 - 8 . )\n",
            "2020-02-04 21:23:08,660 Example #1\n",
            "2020-02-04 21:23:08,661 \tSource:     today , more than ever before , there are so many things that can distract us .\n",
            "2020-02-04 21:23:08,661 \tReference:  lelu , kuene ima iavulu i tena ku tu landukisa , m’ukulu ndenge .\n",
            "2020-02-04 21:23:08,661 \tHypothesis: lelu , ande dia ku bhita o ima , saí ima iavulu i tena ku tu landukisa .\n",
            "2020-02-04 21:23:08,661 Example #2\n",
            "2020-02-04 21:23:08,661 \tSource:     the apostle paul warned about what can happen if we please ourselves first .\n",
            "2020-02-04 21:23:08,661 \tReference:  ( 1 nzuá 2 : 16 ) o poxolo phaulu ua tu dimuna ku ima i tena kubhita se tu suua ngó o ima i tua uabhela .\n",
            "2020-02-04 21:23:08,661 \tHypothesis: o poxolo phaulu ua dimuna o ima i tu tena o kubhanga , se tu i sangulukisa o muxima ua nzambi .\n",
            "2020-02-04 21:23:08,661 Example #3\n",
            "2020-02-04 21:23:08,661 \tSource:     what a privilege it is to live in these last days and to be part of jehovah’s incredible organization !\n",
            "2020-02-04 21:23:08,661 \tReference:  tua tokala ku lembalala izuua ioso kuila , ujitu ua dikota ku tokala mu kilunga kia jihova mu ixi , mu izuua isukidila - ku !\n",
            "2020-02-04 21:23:08,661 \tHypothesis: ujitu ua dikota ku kala mu izuua íii isukidila - ku , ni ku bhanga mbandu ku kilunga kia jihova !\n",
            "2020-02-04 21:23:08,661 Validation result (greedy) at epoch  25, step    18000: bleu:  27.56, loss: 34467.0391, ppl:   4.6902, duration: 28.0627s\n",
            "2020-02-04 21:23:51,963 Epoch  25: total training loss 728.06\n",
            "2020-02-04 21:23:51,963 EPOCH 26\n",
            "2020-02-04 21:24:17,140 Epoch  26 Step:    18200 Batch Loss:     1.026593 Tokens per Sec:     6702, Lr: 0.000164\n",
            "2020-02-04 21:25:26,300 Epoch  26 Step:    18400 Batch Loss:     0.830519 Tokens per Sec:     6723, Lr: 0.000163\n",
            "2020-02-04 21:26:35,436 Epoch  26 Step:    18600 Batch Loss:     0.875107 Tokens per Sec:     6723, Lr: 0.000162\n",
            "2020-02-04 21:27:45,162 Epoch  26 Step:    18800 Batch Loss:     1.146535 Tokens per Sec:     6752, Lr: 0.000161\n",
            "2020-02-04 21:28:03,264 Epoch  26: total training loss 715.50\n",
            "2020-02-04 21:28:03,265 EPOCH 27\n",
            "2020-02-04 21:28:54,111 Epoch  27 Step:    19000 Batch Loss:     1.102405 Tokens per Sec:     6734, Lr: 0.000160\n",
            "2020-02-04 21:30:03,059 Epoch  27 Step:    19200 Batch Loss:     0.899686 Tokens per Sec:     6778, Lr: 0.000159\n",
            "2020-02-04 21:31:12,257 Epoch  27 Step:    19400 Batch Loss:     1.039293 Tokens per Sec:     6745, Lr: 0.000159\n",
            "2020-02-04 21:32:13,601 Epoch  27: total training loss 699.15\n",
            "2020-02-04 21:32:13,601 EPOCH 28\n",
            "2020-02-04 21:32:21,225 Epoch  28 Step:    19600 Batch Loss:     0.930306 Tokens per Sec:     6531, Lr: 0.000158\n",
            "2020-02-04 21:33:30,177 Epoch  28 Step:    19800 Batch Loss:     0.697419 Tokens per Sec:     6700, Lr: 0.000157\n",
            "2020-02-04 21:34:39,432 Epoch  28 Step:    20000 Batch Loss:     1.109810 Tokens per Sec:     6770, Lr: 0.000156\n",
            "2020-02-04 21:35:04,403 Hooray! New best validation result [eval_metric]!\n",
            "2020-02-04 21:35:04,404 Saving new checkpoint.\n",
            "2020-02-04 21:35:06,124 Example #0\n",
            "2020-02-04 21:35:06,124 \tSource:     ( read philippians 2 : 5 - 8 . )\n",
            "2020-02-04 21:35:06,124 \tReference:  ( tanga filipe 2 : 5 - 8 . )\n",
            "2020-02-04 21:35:06,124 \tHypothesis: ( tanga filipe 2 : 5 - 8 . )\n",
            "2020-02-04 21:35:06,124 Example #1\n",
            "2020-02-04 21:35:06,125 \tSource:     today , more than ever before , there are so many things that can distract us .\n",
            "2020-02-04 21:35:06,125 \tReference:  lelu , kuene ima iavulu i tena ku tu landukisa , m’ukulu ndenge .\n",
            "2020-02-04 21:35:06,125 \tHypothesis: lelu , ande dia ku bhita o ima , saí ima i tena ku tu landukisa .\n",
            "2020-02-04 21:35:06,125 Example #2\n",
            "2020-02-04 21:35:06,125 \tSource:     the apostle paul warned about what can happen if we please ourselves first .\n",
            "2020-02-04 21:35:06,125 \tReference:  ( 1 nzuá 2 : 16 ) o poxolo phaulu ua tu dimuna ku ima i tena kubhita se tu suua ngó o ima i tua uabhela .\n",
            "2020-02-04 21:35:06,125 \tHypothesis: o poxolo phaulu ua dimuna o ima i tua tokala o kubhanga , se tu dióndo tua - nda dióndo .\n",
            "2020-02-04 21:35:06,125 Example #3\n",
            "2020-02-04 21:35:06,125 \tSource:     what a privilege it is to live in these last days and to be part of jehovah’s incredible organization !\n",
            "2020-02-04 21:35:06,125 \tReference:  tua tokala ku lembalala izuua ioso kuila , ujitu ua dikota ku tokala mu kilunga kia jihova mu ixi , mu izuua isukidila - ku !\n",
            "2020-02-04 21:35:06,125 \tHypothesis: ujitu ua dikota ku kala mu izuua íii isukidila - ku , ni ku bhanga mbandu ku kilunga kia jihova ku mundu uoso !\n",
            "2020-02-04 21:35:06,125 Validation result (greedy) at epoch  28, step    20000: bleu:  27.83, loss: 34512.3594, ppl:   4.6997, duration: 26.6926s\n",
            "2020-02-04 21:36:15,361 Epoch  28 Step:    20200 Batch Loss:     1.001297 Tokens per Sec:     6779, Lr: 0.000155\n",
            "2020-02-04 21:36:51,103 Epoch  28: total training loss 685.74\n",
            "2020-02-04 21:36:51,104 EPOCH 29\n",
            "2020-02-04 21:37:24,494 Epoch  29 Step:    20400 Batch Loss:     0.780829 Tokens per Sec:     6656, Lr: 0.000155\n",
            "2020-02-04 21:38:34,375 Epoch  29 Step:    20600 Batch Loss:     1.058090 Tokens per Sec:     6789, Lr: 0.000154\n",
            "2020-02-04 21:39:43,651 Epoch  29 Step:    20800 Batch Loss:     1.022833 Tokens per Sec:     6713, Lr: 0.000153\n",
            "2020-02-04 21:40:52,962 Epoch  29 Step:    21000 Batch Loss:     1.021863 Tokens per Sec:     6729, Lr: 0.000152\n",
            "2020-02-04 21:41:02,519 Epoch  29: total training loss 672.60\n",
            "2020-02-04 21:41:02,520 EPOCH 30\n",
            "2020-02-04 21:42:02,234 Epoch  30 Step:    21200 Batch Loss:     1.052732 Tokens per Sec:     6722, Lr: 0.000152\n",
            "2020-02-04 21:43:11,251 Epoch  30 Step:    21400 Batch Loss:     0.706357 Tokens per Sec:     6738, Lr: 0.000151\n",
            "2020-02-04 21:44:20,417 Epoch  30 Step:    21600 Batch Loss:     1.044521 Tokens per Sec:     6757, Lr: 0.000150\n",
            "2020-02-04 21:45:12,997 Epoch  30: total training loss 660.55\n",
            "2020-02-04 21:45:12,998 EPOCH 31\n",
            "2020-02-04 21:45:29,078 Epoch  31 Step:    21800 Batch Loss:     0.909594 Tokens per Sec:     6664, Lr: 0.000150\n",
            "2020-02-04 21:46:38,634 Epoch  31 Step:    22000 Batch Loss:     1.052121 Tokens per Sec:     6702, Lr: 0.000149\n",
            "2020-02-04 21:47:04,968 Example #0\n",
            "2020-02-04 21:47:04,968 \tSource:     ( read philippians 2 : 5 - 8 . )\n",
            "2020-02-04 21:47:04,968 \tReference:  ( tanga filipe 2 : 5 - 8 . )\n",
            "2020-02-04 21:47:04,968 \tHypothesis: ( tanga filipe 2 : 5 - 8 . )\n",
            "2020-02-04 21:47:04,968 Example #1\n",
            "2020-02-04 21:47:04,968 \tSource:     today , more than ever before , there are so many things that can distract us .\n",
            "2020-02-04 21:47:04,968 \tReference:  lelu , kuene ima iavulu i tena ku tu landukisa , m’ukulu ndenge .\n",
            "2020-02-04 21:47:04,968 \tHypothesis: lelu , ande dia kubhita o ima ioso , sai ima iavulu i tena ku tu landukisa .\n",
            "2020-02-04 21:47:04,969 Example #2\n",
            "2020-02-04 21:47:04,969 \tSource:     the apostle paul warned about what can happen if we please ourselves first .\n",
            "2020-02-04 21:47:04,969 \tReference:  ( 1 nzuá 2 : 16 ) o poxolo phaulu ua tu dimuna ku ima i tena kubhita se tu suua ngó o ima i tua uabhela .\n",
            "2020-02-04 21:47:04,969 \tHypothesis: o poxolo phaulu ua dimuna o ima i tena ku bhita , se tu bhanga o ima i tua mesena .\n",
            "2020-02-04 21:47:04,969 Example #3\n",
            "2020-02-04 21:47:04,969 \tSource:     what a privilege it is to live in these last days and to be part of jehovah’s incredible organization !\n",
            "2020-02-04 21:47:04,969 \tReference:  tua tokala ku lembalala izuua ioso kuila , ujitu ua dikota ku tokala mu kilunga kia jihova mu ixi , mu izuua isukidila - ku !\n",
            "2020-02-04 21:47:04,969 \tHypothesis: ujitu ua dikota ku kala mu izuua íii isukidila - ku , ni ku bhanga mbandu ku kilunga kia jihova !\n",
            "2020-02-04 21:47:04,969 Validation result (greedy) at epoch  31, step    22000: bleu:  27.66, loss: 34950.4023, ppl:   4.7929, duration: 26.3353s\n",
            "2020-02-04 21:48:13,895 Epoch  31 Step:    22200 Batch Loss:     0.803242 Tokens per Sec:     6752, Lr: 0.000148\n",
            "2020-02-04 21:49:23,200 Epoch  31 Step:    22400 Batch Loss:     1.006292 Tokens per Sec:     6735, Lr: 0.000148\n",
            "2020-02-04 21:49:50,363 Epoch  31: total training loss 649.05\n",
            "2020-02-04 21:49:50,363 EPOCH 32\n",
            "2020-02-04 21:50:32,894 Epoch  32 Step:    22600 Batch Loss:     1.001542 Tokens per Sec:     6767, Lr: 0.000147\n",
            "2020-02-04 21:51:41,897 Epoch  32 Step:    22800 Batch Loss:     1.021521 Tokens per Sec:     6709, Lr: 0.000146\n",
            "2020-02-04 21:52:51,202 Epoch  32 Step:    23000 Batch Loss:     1.034444 Tokens per Sec:     6747, Lr: 0.000146\n",
            "2020-02-04 21:54:00,508 Epoch  32 Step:    23200 Batch Loss:     0.662240 Tokens per Sec:     6760, Lr: 0.000145\n",
            "2020-02-04 21:54:01,151 Epoch  32: total training loss 636.39\n",
            "2020-02-04 21:54:01,151 EPOCH 33\n",
            "2020-02-04 21:55:09,802 Epoch  33 Step:    23400 Batch Loss:     0.973105 Tokens per Sec:     6725, Lr: 0.000144\n",
            "2020-02-04 21:56:19,202 Epoch  33 Step:    23600 Batch Loss:     0.733882 Tokens per Sec:     6752, Lr: 0.000144\n",
            "2020-02-04 21:57:27,851 Epoch  33 Step:    23800 Batch Loss:     0.902763 Tokens per Sec:     6733, Lr: 0.000143\n",
            "2020-02-04 21:58:11,830 Epoch  33: total training loss 626.07\n",
            "2020-02-04 21:58:11,830 EPOCH 34\n",
            "2020-02-04 21:58:37,167 Epoch  34 Step:    24000 Batch Loss:     0.886888 Tokens per Sec:     6783, Lr: 0.000143\n",
            "2020-02-04 21:59:03,626 Example #0\n",
            "2020-02-04 21:59:03,626 \tSource:     ( read philippians 2 : 5 - 8 . )\n",
            "2020-02-04 21:59:03,626 \tReference:  ( tanga filipe 2 : 5 - 8 . )\n",
            "2020-02-04 21:59:03,626 \tHypothesis: ( tanga filipe 2 : 5 - 8 . )\n",
            "2020-02-04 21:59:03,627 Example #1\n",
            "2020-02-04 21:59:03,627 \tSource:     today , more than ever before , there are so many things that can distract us .\n",
            "2020-02-04 21:59:03,627 \tReference:  lelu , kuene ima iavulu i tena ku tu landukisa , m’ukulu ndenge .\n",
            "2020-02-04 21:59:03,627 \tHypothesis: lelu , ande dia ima iavulu ku tu landukisa .\n",
            "2020-02-04 21:59:03,627 Example #2\n",
            "2020-02-04 21:59:03,627 \tSource:     the apostle paul warned about what can happen if we please ourselves first .\n",
            "2020-02-04 21:59:03,627 \tReference:  ( 1 nzuá 2 : 16 ) o poxolo phaulu ua tu dimuna ku ima i tena kubhita se tu suua ngó o ima i tua uabhela .\n",
            "2020-02-04 21:59:03,627 \tHypothesis: o poxolo phaulu ua dimuna o ima i tu tena o kubhanga , se tu bhanga o ima i tua mesena ia tu sangulukisa .\n",
            "2020-02-04 21:59:03,627 Example #3\n",
            "2020-02-04 21:59:03,627 \tSource:     what a privilege it is to live in these last days and to be part of jehovah’s incredible organization !\n",
            "2020-02-04 21:59:03,627 \tReference:  tua tokala ku lembalala izuua ioso kuila , ujitu ua dikota ku tokala mu kilunga kia jihova mu ixi , mu izuua isukidila - ku !\n",
            "2020-02-04 21:59:03,628 \tHypothesis: ujitu ua dikota ku kala mu izuua íii isukidila - ku , ni ku bhanga mbandu ku kilunga kia jihova mu ngongo ioso !\n",
            "2020-02-04 21:59:03,628 Validation result (greedy) at epoch  34, step    24000: bleu:  27.81, loss: 35239.2383, ppl:   4.8554, duration: 26.4609s\n",
            "2020-02-04 22:00:12,651 Epoch  34 Step:    24200 Batch Loss:     0.735904 Tokens per Sec:     6769, Lr: 0.000142\n",
            "2020-02-04 22:01:21,751 Epoch  34 Step:    24400 Batch Loss:     0.764096 Tokens per Sec:     6725, Lr: 0.000141\n",
            "2020-02-04 22:02:31,084 Epoch  34 Step:    24600 Batch Loss:     0.675966 Tokens per Sec:     6724, Lr: 0.000141\n",
            "2020-02-04 22:02:48,997 Epoch  34: total training loss 615.29\n",
            "2020-02-04 22:02:48,997 EPOCH 35\n",
            "2020-02-04 22:03:40,033 Epoch  35 Step:    24800 Batch Loss:     0.966231 Tokens per Sec:     6778, Lr: 0.000140\n",
            "2020-02-04 22:04:49,458 Epoch  35 Step:    25000 Batch Loss:     1.004890 Tokens per Sec:     6685, Lr: 0.000140\n",
            "2020-02-04 22:05:58,423 Epoch  35 Step:    25200 Batch Loss:     0.799023 Tokens per Sec:     6731, Lr: 0.000139\n",
            "2020-02-04 22:06:59,902 Epoch  35: total training loss 604.89\n",
            "2020-02-04 22:06:59,902 EPOCH 36\n",
            "2020-02-04 22:07:07,737 Epoch  36 Step:    25400 Batch Loss:     0.851759 Tokens per Sec:     6783, Lr: 0.000139\n",
            "2020-02-04 22:08:16,531 Epoch  36 Step:    25600 Batch Loss:     0.867299 Tokens per Sec:     6737, Lr: 0.000138\n",
            "2020-02-04 22:09:25,729 Epoch  36 Step:    25800 Batch Loss:     0.928555 Tokens per Sec:     6769, Lr: 0.000138\n",
            "2020-02-04 22:10:34,258 Epoch  36 Step:    26000 Batch Loss:     0.984406 Tokens per Sec:     6720, Lr: 0.000137\n",
            "2020-02-04 22:11:00,673 Hooray! New best validation result [eval_metric]!\n",
            "2020-02-04 22:11:00,673 Saving new checkpoint.\n",
            "2020-02-04 22:11:02,345 Example #0\n",
            "2020-02-04 22:11:02,345 \tSource:     ( read philippians 2 : 5 - 8 . )\n",
            "2020-02-04 22:11:02,345 \tReference:  ( tanga filipe 2 : 5 - 8 . )\n",
            "2020-02-04 22:11:02,346 \tHypothesis: ( tanga filipe 2 : 5 - 8 . )\n",
            "2020-02-04 22:11:02,346 Example #1\n",
            "2020-02-04 22:11:02,346 \tSource:     today , more than ever before , there are so many things that can distract us .\n",
            "2020-02-04 22:11:02,346 \tReference:  lelu , kuene ima iavulu i tena ku tu landukisa , m’ukulu ndenge .\n",
            "2020-02-04 22:11:02,346 \tHypothesis: lelu , m’ukulu , kuene ima iavulu i tena ku tu landukisa .\n",
            "2020-02-04 22:11:02,346 Example #2\n",
            "2020-02-04 22:11:02,346 \tSource:     the apostle paul warned about what can happen if we please ourselves first .\n",
            "2020-02-04 22:11:02,346 \tReference:  ( 1 nzuá 2 : 16 ) o poxolo phaulu ua tu dimuna ku ima i tena kubhita se tu suua ngó o ima i tua uabhela .\n",
            "2020-02-04 22:11:02,347 \tHypothesis: o poxolo phaulu ua dimuna o ima i tu tena kubhanga , se tu dióndo dianga kusota o kitambuijilu kia poxolo .\n",
            "2020-02-04 22:11:02,347 Example #3\n",
            "2020-02-04 22:11:02,347 \tSource:     what a privilege it is to live in these last days and to be part of jehovah’s incredible organization !\n",
            "2020-02-04 22:11:02,347 \tReference:  tua tokala ku lembalala izuua ioso kuila , ujitu ua dikota ku tokala mu kilunga kia jihova mu ixi , mu izuua isukidila - ku !\n",
            "2020-02-04 22:11:02,347 \tHypothesis: ujitu ua dikota ku kala mu izuua íii isukidila - ku , ni ku bhanga mbandu ku kilunga kia jihova mu kilunga kiê !\n",
            "2020-02-04 22:11:02,347 Validation result (greedy) at epoch  36, step    26000: bleu:  28.57, loss: 35323.6875, ppl:   4.8738, duration: 28.0886s\n",
            "2020-02-04 22:11:38,789 Epoch  36: total training loss 595.59\n",
            "2020-02-04 22:11:38,789 EPOCH 37\n",
            "2020-02-04 22:12:11,812 Epoch  37 Step:    26200 Batch Loss:     0.837748 Tokens per Sec:     6688, Lr: 0.000137\n",
            "2020-02-04 22:13:20,839 Epoch  37 Step:    26400 Batch Loss:     0.885409 Tokens per Sec:     6753, Lr: 0.000136\n",
            "2020-02-04 22:14:29,896 Epoch  37 Step:    26600 Batch Loss:     0.945491 Tokens per Sec:     6753, Lr: 0.000135\n",
            "2020-02-04 22:15:38,565 Epoch  37 Step:    26800 Batch Loss:     0.869143 Tokens per Sec:     6750, Lr: 0.000135\n",
            "2020-02-04 22:15:49,475 Epoch  37: total training loss 585.76\n",
            "2020-02-04 22:15:49,475 EPOCH 38\n",
            "2020-02-04 22:16:48,111 Epoch  38 Step:    27000 Batch Loss:     0.751874 Tokens per Sec:     6739, Lr: 0.000134\n",
            "2020-02-04 22:17:57,532 Epoch  38 Step:    27200 Batch Loss:     0.771263 Tokens per Sec:     6784, Lr: 0.000134\n",
            "2020-02-04 22:19:06,625 Epoch  38 Step:    27400 Batch Loss:     0.903140 Tokens per Sec:     6732, Lr: 0.000133\n",
            "2020-02-04 22:19:59,952 Epoch  38: total training loss 574.80\n",
            "2020-02-04 22:19:59,952 EPOCH 39\n",
            "2020-02-04 22:20:15,691 Epoch  39 Step:    27600 Batch Loss:     0.824561 Tokens per Sec:     6609, Lr: 0.000133\n",
            "2020-02-04 22:21:24,953 Epoch  39 Step:    27800 Batch Loss:     0.907379 Tokens per Sec:     6778, Lr: 0.000133\n",
            "2020-02-04 22:22:33,948 Epoch  39 Step:    28000 Batch Loss:     0.561887 Tokens per Sec:     6718, Lr: 0.000132\n",
            "2020-02-04 22:22:59,746 Example #0\n",
            "2020-02-04 22:22:59,747 \tSource:     ( read philippians 2 : 5 - 8 . )\n",
            "2020-02-04 22:22:59,747 \tReference:  ( tanga filipe 2 : 5 - 8 . )\n",
            "2020-02-04 22:22:59,747 \tHypothesis: ( tanga filipe 2 : 5 - 8 . )\n",
            "2020-02-04 22:22:59,747 Example #1\n",
            "2020-02-04 22:22:59,747 \tSource:     today , more than ever before , there are so many things that can distract us .\n",
            "2020-02-04 22:22:59,747 \tReference:  lelu , kuene ima iavulu i tena ku tu landukisa , m’ukulu ndenge .\n",
            "2020-02-04 22:22:59,747 \tHypothesis: lelu , ande dia ima iavulu ku bhita , saí ima iavulu i tena ku tu landukisa .\n",
            "2020-02-04 22:22:59,747 Example #2\n",
            "2020-02-04 22:22:59,747 \tSource:     the apostle paul warned about what can happen if we please ourselves first .\n",
            "2020-02-04 22:22:59,747 \tReference:  ( 1 nzuá 2 : 16 ) o poxolo phaulu ua tu dimuna ku ima i tena kubhita se tu suua ngó o ima i tua uabhela .\n",
            "2020-02-04 22:22:59,747 \tHypothesis: o poxolo phaulu ua dimuna o ima i tena kubhita , se tu bhanga o ima i tua mesena .\n",
            "2020-02-04 22:22:59,747 Example #3\n",
            "2020-02-04 22:22:59,748 \tSource:     what a privilege it is to live in these last days and to be part of jehovah’s incredible organization !\n",
            "2020-02-04 22:22:59,748 \tReference:  tua tokala ku lembalala izuua ioso kuila , ujitu ua dikota ku tokala mu kilunga kia jihova mu ixi , mu izuua isukidila - ku !\n",
            "2020-02-04 22:22:59,748 \tHypothesis: ujitu ua dikota ku kala mu izuua íii isukidila - ku , ni ku bhanga mbandu ku kilunga kia jihova !\n",
            "2020-02-04 22:22:59,748 Validation result (greedy) at epoch  39, step    28000: bleu:  27.94, loss: 35734.3711, ppl:   4.9644, duration: 25.7991s\n",
            "2020-02-04 22:24:09,053 Epoch  39 Step:    28200 Batch Loss:     0.667817 Tokens per Sec:     6729, Lr: 0.000132\n",
            "2020-02-04 22:24:36,539 Epoch  39: total training loss 566.02\n",
            "2020-02-04 22:24:36,539 EPOCH 40\n",
            "2020-02-04 22:25:18,384 Epoch  40 Step:    28400 Batch Loss:     0.907208 Tokens per Sec:     6721, Lr: 0.000131\n",
            "2020-02-04 22:26:27,322 Epoch  40 Step:    28600 Batch Loss:     0.836263 Tokens per Sec:     6749, Lr: 0.000131\n",
            "2020-02-04 22:27:36,618 Epoch  40 Step:    28800 Batch Loss:     0.577048 Tokens per Sec:     6768, Lr: 0.000130\n",
            "2020-02-04 22:28:45,588 Epoch  40 Step:    29000 Batch Loss:     0.725040 Tokens per Sec:     6770, Lr: 0.000130\n",
            "2020-02-04 22:28:46,906 Epoch  40: total training loss 557.25\n",
            "2020-02-04 22:28:46,907 Training ended after  40 epochs.\n",
            "2020-02-04 22:28:46,907 Best validation result (greedy) at step    26000:  28.57 eval_metric.\n",
            "2020-02-04 22:29:27,124  dev bleu:  28.81 [Beam search decoding with beam size = 5 and alpha = 1.0]\n",
            "2020-02-04 22:29:27,125 Translations saved to: models/enkmb_transformer/00026000.hyps.dev\n",
            "2020-02-04 22:31:02,627 test bleu:  32.76 [Beam search decoding with beam size = 5 and alpha = 1.0]\n",
            "2020-02-04 22:31:02,629 Translations saved to: models/enkmb_transformer/00026000.hyps.test\n"
          ],
          "name": "stdout"
        }
      ]
    },
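    {
      "cell_type": "markdown",
      "metadata": {
        "id": "peekHypsMd01",
        "colab_type": "text"
      },
      "source": [
        "Training stopped after 40 epochs; the best checkpoint (step 26000) reaches 28.81 dev BLEU and 32.76 test BLEU with beam search. As a quick sanity check, we can peek at a few of the beam-search hypotheses the run just saved (the path is taken from the log above)."
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "peekHyps01",
        "colab_type": "code",
        "colab": {}
      },
      "source": [
        "# Show the first few decoded test-set translations saved by the run above.\n",
        "! head -n 5 joeynmt/models/enkmb_transformer/00026000.hyps.test"
      ],
      "execution_count": 0,
      "outputs": []
    },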
    {
      "cell_type": "code",
      "metadata": {
        "id": "EgjqnSBFn2eh",
        "colab_type": "code",
        "outputId": "2d516f75-cac4-43ff-fbc8-27b40f508ea4",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 260
        }
      },
      "source": [
        "! cat joeynmt/models/enkmb_transformer/validations.txt"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "Steps: 2000\tLoss: 47097.83984\tPPL: 8.26328\tbleu: 14.85684\tLR: 0.00049411\t*\n",
            "Steps: 4000\tLoss: 39856.20312\tPPL: 5.97219\tbleu: 20.91001\tLR: 0.00034939\t*\n",
            "Steps: 6000\tLoss: 37121.82812\tPPL: 5.28307\tbleu: 23.80892\tLR: 0.00028527\t*\n",
            "Steps: 8000\tLoss: 35674.49609\tPPL: 4.95110\tbleu: 25.00066\tLR: 0.00024705\t*\n",
            "Steps: 10000\tLoss: 34720.73828\tPPL: 4.74383\tbleu: 26.10309\tLR: 0.00022097\t*\n",
            "Steps: 12000\tLoss: 34417.08594\tPPL: 4.67967\tbleu: 26.15923\tLR: 0.00020172\t*\n",
            "Steps: 14000\tLoss: 34284.73047\tPPL: 4.65198\tbleu: 27.20639\tLR: 0.00018675\t*\n",
            "Steps: 16000\tLoss: 34352.42969\tPPL: 4.66613\tbleu: 27.29388\tLR: 0.00017469\t*\n",
            "Steps: 18000\tLoss: 34467.03906\tPPL: 4.69017\tbleu: 27.56135\tLR: 0.00016470\t*\n",
            "Steps: 20000\tLoss: 34512.35938\tPPL: 4.69971\tbleu: 27.82828\tLR: 0.00015625\t*\n",
            "Steps: 22000\tLoss: 34950.40234\tPPL: 4.79293\tbleu: 27.65960\tLR: 0.00014898\t\n",
            "Steps: 24000\tLoss: 35239.23828\tPPL: 4.85541\tbleu: 27.80829\tLR: 0.00014264\t\n",
            "Steps: 26000\tLoss: 35323.68750\tPPL: 4.87383\tbleu: 28.57189\tLR: 0.00013704\t*\n",
            "Steps: 28000\tLoss: 35734.37109\tPPL: 4.96441\tbleu: 27.94465\tLR: 0.00013206\t\n"
          ],
          "name": "stdout"
        }
      ]
    },
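    {
      "cell_type": "markdown",
      "metadata": {
        "id": "plotBleuMd01",
        "colab_type": "text"
      },
      "source": [
        "The validation table above is easier to read as a curve. Below is a minimal plotting sketch (it assumes matplotlib, which ships with the Colab runtime): it parses the tab-separated `key: value` fields in `validations.txt` and plots validation BLEU against training steps."
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "plotBleu01",
        "colab_type": "code",
        "colab": {}
      },
      "source": [
        "import matplotlib.pyplot as plt\n",
        "\n",
        "steps, bleus = [], []\n",
        "with open('joeynmt/models/enkmb_transformer/validations.txt') as f:\n",
        "    for line in f:\n",
        "        # Fields are tab-separated 'key: value' pairs; new-best rows end in '*'.\n",
        "        fields = dict(p.split(': ') for p in line.strip().split('\\t') if ': ' in p)\n",
        "        steps.append(int(fields['Steps']))\n",
        "        bleus.append(float(fields['bleu']))\n",
        "\n",
        "plt.plot(steps, bleus, marker='o')\n",
        "plt.xlabel('Training steps')\n",
        "plt.ylabel('Validation BLEU')\n",
        "plt.title('en-kmb transformer: validation BLEU')\n",
        "plt.show()"
      ],
      "execution_count": 0,
      "outputs": []
    },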
    {
      "cell_type": "code",
      "metadata": {
        "id": "9NOP9uXVsMDi",
        "colab_type": "code",
        "outputId": "f649ef87-c285-4c4b-b7cb-f44c2251781d",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 69
        }
      },
      "source": [
        "! cd joeynmt; python3 -m joeynmt test models/enkmb_transformer/config.yaml "
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "2020-02-04 22:38:57,549 Hello! This is Joey-NMT.\n",
            "2020-02-04 22:39:38,351  dev bleu:  28.81 [Beam search decoding with beam size = 5 and alpha = 1.0]\n",
            "2020-02-04 22:41:09,183 test bleu:  32.76 [Beam search decoding with beam size = 5 and alpha = 1.0]\n"
          ],
          "name": "stdout"
        }
      ]
    },
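    {
      "cell_type": "markdown",
      "metadata": {
        "id": "translateMd01",
        "colab_type": "text"
      },
      "source": [
        "Besides `train` and `test`, JoeyNMT also offers a `translate` mode that reads source sentences from stdin and decodes them with the best checkpoint in the model directory. Note that input should be preprocessed (tokenised and BPE-encoded) the same way as the training data, so the already-tokenised sentence below is only a rough smoke test."
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "translate01",
        "colab_type": "code",
        "colab": {}
      },
      "source": [
        "# Pipe a source sentence through JoeyNMT's translate mode (same config as above).\n",
        "! cd joeynmt; echo \"( read philippians 2 : 5 - 8 . )\" | python3 -m joeynmt translate models/enkmb_transformer/config.yaml"
      ],
      "execution_count": 0,
      "outputs": []
    },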
    {
      "cell_type": "code",
      "metadata": {
        "id": "NItstdy5XK8i",
        "colab_type": "code",
        "outputId": "fac74fb9-9561-44a6-8eed-8166ad3d7f42",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 124
        }
      },
      "source": [
        "from google.colab import drive\n",
        "drive.mount('/content/drive')"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "Go to this URL in a browser: https://accounts.google.com/o/oauth2/auth?client_id=947318989803-6bn6qk8qdgf4n4g3pfee6491hc0brc4i.apps.googleusercontent.com&redirect_uri=urn%3aietf%3awg%3aoauth%3a2.0%3aoob&response_type=code&scope=email%20https%3a%2f%2fwww.googleapis.com%2fauth%2fdocs.test%20https%3a%2f%2fwww.googleapis.com%2fauth%2fdrive%20https%3a%2f%2fwww.googleapis.com%2fauth%2fdrive.photos.readonly%20https%3a%2f%2fwww.googleapis.com%2fauth%2fpeopleapi.readonly\n",
            "\n",
            "Enter your authorization code:\n",
            "··········\n",
            "Mounted at /content/drive\n"
          ],
          "name": "stdout"
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "0LXxVAOVp9Y8",
        "colab_type": "code",
        "colab": {}
      },
      "source": [
        "!mkdir -p /content/drive/My\\ Drive/masakhane/kmb"
      ],
      "execution_count": 0,
      "outputs": []
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "8C1ZqQU3XqWM",
        "colab_type": "code",
        "outputId": "590d6509-f9da-406f-ee9b-66c8ac3a1105",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 54
        }
      },
      "source": [
        "!cp -r joeynmt/models/enkmb_transformer /content/drive/My\\ Drive/masakhane/kmb/"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "cp: cannot create symbolic link '/content/drive/My Drive/masakhane/kmb/enkmb_transformer/best.ckpt': Operation not supported\n"
          ],
          "name": "stdout"
        }
      ]
    },
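    {
      "cell_type": "markdown",
      "metadata": {
        "id": "cpSymlinkMd01",
        "colab_type": "text"
      },
      "source": [
        "The warning above matters only for `best.ckpt`: JoeyNMT stores it as a symbolic link to the best checkpoint file, and the Drive FUSE mount cannot create symlinks. A simple workaround is to copy again with `cp -L`, which dereferences symlinks so a real copy of the checkpoint ends up in Drive."
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "cpSymlink01",
        "colab_type": "code",
        "colab": {}
      },
      "source": [
        "# -L dereferences symlinks, turning best.ckpt into a regular file in Drive.\n",
        "!cp -rL joeynmt/models/enkmb_transformer /content/drive/My\\\\ Drive/masakhane/kmb/"
      ],
      "execution_count": 0,
      "outputs": []
    },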
    {
      "cell_type": "code",
      "metadata": {
        "id": "nS2NiVSMX5bB",
        "colab_type": "code",
        "colab": {}
      },
      "source": [
        ""
      ],
      "execution_count": 0,
      "outputs": []
    }
  ]
}