# English to Xitsonga
Author: Laura Martinus
## Data
- The JW300 dataset.
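
A minimal sketch of how the JW300 English–Xitsonga pair can be fetched, assuming the `opustools-pkg` tooling used in the Masakhane starter notebooks and the ISO code `ts` for Xitsonga; the output filenames are placeholders, not the ones used for this run.

```python
# Sketch: download the English-Xitsonga portion of JW300 via opustools-pkg.
# Assumes `pip install opustools-pkg` and that "ts" is the Xitsonga code on OPUS.
import subprocess

subprocess.run(
    [
        "opus_read",
        "-d", "JW300",                 # corpus name on OPUS
        "-s", "en",                    # source language
        "-t", "ts",                    # target language (Xitsonga, assumed code)
        "-wm", "moses",                # write plain-text, sentence-aligned files
        "-w", "jw300.en", "jw300.ts",  # output files (placeholder names)
    ],
    check=True,
)
```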
## Model
- Default Masakhane Transformer translation model.
- [Google Drive folder with the model](https://drive.google.com/open?id=1onvLxPsRoem2KGnykp2vLEIe4GldDl84)
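
As a rough sketch rather than the exact commands from this run: the default Masakhane Transformer is built on JoeyNMT, so training and test-set scoring would look roughly like the snippet below; the config path is a placeholder.

```python
# Sketch: train and score the default Masakhane (JoeyNMT) Transformer.
# The config path is hypothetical; the real one comes from the starter notebook.
import subprocess

CONFIG = "joeynmt/configs/transformer_en_ts.yaml"  # assumed location

# Train the Transformer described in the config; this is the step that
# timed out on Google Colab (see the note under Results).
subprocess.run(["python", "-m", "joeynmt", "train", CONFIG], check=True)

# Score the held-out data with the best checkpoint, reporting BLEU.
subprocess.run(["python", "-m", "joeynmt", "test", CONFIG], check=True)
```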
## Analysis
- TODO
## Results
- BLEU dev: 35.07
- BLEU test: 44.15
- Note: It is probably best to train this model for longer, as it timed out on Google Colab
- Note: Will likely benefit from optimising the number of BPE codes
- Re-ran with the number of BPE codes set to 40 000 instead of the default 4 000 (see the sketch after these results) and got the following:
- BLEU dev: 39.01
- BLEU test: 46.41
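
A minimal sketch of the BPE change behind the second set of scores, assuming the `subword-nmt` command-line tools used in the Masakhane starter notebooks; the file names are placeholders, and the resulting subword files would then be pointed to from the JoeyNMT config.

```python
# Sketch: re-learn BPE with 40 000 merge operations instead of the default 4 000,
# using the subword-nmt command-line tools. File names are assumptions.
import subprocess

NUM_BPE_CODES = 40_000  # the setting that produced dev 39.01 / test 46.41

# Learn BPE merge operations on the concatenated training text.
with open("train.en-ts.txt") as src, open("bpe.codes", "w") as codes:
    subprocess.run(["subword-nmt", "learn-bpe", "-s", str(NUM_BPE_CODES)],
                   stdin=src, stdout=codes, check=True)

# Apply the learned codes to each side of the corpus.
for lang in ("en", "ts"):
    with open(f"train.{lang}") as inp, open(f"train.bpe.{lang}", "w") as out:
        subprocess.run(["subword-nmt", "apply-bpe", "-c", "bpe.codes"],
                       stdin=inp, stdout=out, check=True)
```

The larger merge count gave longer subword units and better BLEU here, though finding the optimal number of BPE codes would still need a proper sweep.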