{ "nbformat": 4, "nbformat_minor": 0, "metadata": { "accelerator": "GPU", "colab": { "name": "JW300-ts-baseline.ipynb", "provenance": [], "collapsed_sections": [], "toc_visible": true }, "kernelspec": { "display_name": "Python 3", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.5.6" } }, "cells": [ { "cell_type": "markdown", "metadata": { "colab_type": "text", "id": "Igc5itf-xMGj" }, "source": [ "# Masakhane - Machine Translation for African Languages (Using JoeyNMT)" ] }, { "cell_type": "markdown", "metadata": { "colab_type": "text", "id": "x4fXCKCf36IK" }, "source": [ "## Note before beginning:\n", "### - The idea is that you should be able to make minimal changes to this in order to get SOME result for your own translation corpus. \n", "\n", "### - The tl;dr: Go to the **\"TODO\"** comments which will tell you what to update to get up and running\n", "\n", "### - If you actually want to have a clue what you're doing, read the text and peek at the links\n", "\n", "### - With 100 epochs, it should take around 7 hours to run in Google Colab\n", "\n", "### - Once you've gotten a result for your language, please attach and email your notebook that generated it to masakhanetranslation@gmail.com\n", "\n", "### - If you care enough and get a chance, doing a brief background on your language would be amazing. See examples in [(Martinus, 2019)](https://arxiv.org/abs/1906.05685)" ] }, { "cell_type": "markdown", "metadata": { "colab_type": "text", "id": "l929HimrxS0a" }, "source": [ "## Retrieve your data & make a parallel corpus\n", "\n", "If you are wanting to use the JW300 data referenced on the Masakhane website or in our GitHub repo, you can use `opus-tools` to convert the data into a convenient format. `opus_read` from that package provides a convenient tool for reading the native aligned XML files and to convert them to TMX format. The tool can also be used to fetch relevant files from OPUS on the fly and to filter the data as necessary. [Read the documentation](https://pypi.org/project/opustools-pkg/) for more details.\n", "\n", "Once you have your corpus files in TMX format (an xml structure which will include the sentences in your target language and your source language in a single file), we recommend reading them into a pandas dataframe. Thankfully, Jade wrote a silly `tmx2dataframe` package which converts your tmx file to a pandas dataframe. 
" ] }, { "cell_type": "code", "metadata": { "colab_type": "code", "id": "oGRmDELn7Az0", "outputId": "9becebd9-a8e5-4ed9-bd60-529768670e95", "colab": { "base_uri": "https://localhost:8080/", "height": 121 } }, "source": [ "from google.colab import drive\n", "drive.mount('/content/drive')" ], "execution_count": 0, "outputs": [ { "output_type": "stream", "text": [ "Go to this URL in a browser: https://accounts.google.com/o/oauth2/auth?client_id=947318989803-6bn6qk8qdgf4n4g3pfee6491hc0brc4i.apps.googleusercontent.com&redirect_uri=urn%3aietf%3awg%3aoauth%3a2.0%3aoob&response_type=code&scope=email%20https%3a%2f%2fwww.googleapis.com%2fauth%2fdocs.test%20https%3a%2f%2fwww.googleapis.com%2fauth%2fdrive%20https%3a%2f%2fwww.googleapis.com%2fauth%2fdrive.photos.readonly%20https%3a%2f%2fwww.googleapis.com%2fauth%2fpeopleapi.readonly\n", "\n", "Enter your authorization code:\n", "··········\n", "Mounted at /content/drive\n" ], "name": "stdout" } ] }, { "cell_type": "code", "metadata": { "colab_type": "code", "id": "Cn3tgQLzUxwn", "colab": {} }, "source": [ "# TODO: Set your source and target languages. Keep in mind, these traditionally use language codes as found here:\n", "# These will also become the suffix's of all vocab and corpus files used throughout\n", "import os\n", "source_language = \"en\"\n", "target_language = \"ts\" \n", "lc = False # If True, lowercase the data.\n", "seed = 42 # Random seed for shuffling.\n", "tag = \"jw300-baseline\" # Give a unique name to your folder - this is to ensure you don't rewrite any models you've already submitted\n", "\n", "os.environ[\"src\"] = source_language # Sets them in bash as well, since we often use bash scripts\n", "os.environ[\"tgt\"] = target_language\n", "os.environ[\"tag\"] = tag\n", "\n", "# This will save it to a folder in our gdrive instead!\n", "!mkdir -p \"/content/drive/My Drive/masakhane/$src-$tgt-$tag\"\n", "os.environ[\"gdrive_path\"] = \"/content/drive/My Drive/masakhane/%s-%s-%s\" % (source_language, target_language, tag)" ], "execution_count": 0, "outputs": [] }, { "cell_type": "code", "metadata": { "colab_type": "code", "id": "kBSgJHEw7Nvx", "outputId": "0f69c6ad-014a-4f5b-ffe2-2a88f25c177b", "colab": { "base_uri": "https://localhost:8080/", "height": 34 } }, "source": [ "!echo $gdrive_path" ], "execution_count": 0, "outputs": [ { "output_type": "stream", "text": [ "/content/drive/My Drive/masakhane/en-ts-jw300-baseline\n" ], "name": "stdout" } ] }, { "cell_type": "code", "metadata": { "colab_type": "code", "id": "gA75Fs9ys8Y9", "outputId": "83cf620f-d42f-4b3e-dd7d-8219a483d99a", "colab": { "base_uri": "https://localhost:8080/", "height": 121 } }, "source": [ "# Install opus-tools\n", "! 
pip install opustools-pkg" ], "execution_count": 0, "outputs": [ { "output_type": "stream", "text": [ "Collecting opustools-pkg\n", " Downloading https://files.pythonhosted.org/packages/6c/9f/e829a0cceccc603450cd18e1ff80807b6237a88d9a8df2c0bb320796e900/opustools_pkg-0.0.52-py3-none-any.whl (80kB)\n", "Installing collected packages: opustools-pkg\n", "Successfully installed opustools-pkg-0.0.52\n" ], "name": "stdout" } ] }, { "cell_type": "code", "metadata": { "colab_type": "code", "id": "xq-tDZVks7ZD", "outputId": "3d8b2378-191f-4d98-f9d5-587f24250603", "colab": { "base_uri": "https://localhost:8080/", "height": 202 } }, "source": [ "# Downloading our corpus\n", "! opus_read -d JW300 -s $src -t $tgt -wm moses -w jw300.$src jw300.$tgt -q\n", "\n", "# extract the corpus file\n", "! gunzip JW300_latest_xml_$src-$tgt.xml.gz" ], "execution_count": 0, "outputs": [ { "output_type": "stream", "text": [ "\n", "Alignment file /proj/nlpl/data/OPUS/JW300/latest/xml/en-ts.xml.gz not found. The following files are available for downloading:\n", "\n", " 7 MB https://object.pouta.csc.fi/OPUS-JW300/v1/xml/en-ts.xml.gz\n", " 263 MB https://object.pouta.csc.fi/OPUS-JW300/v1/xml/en.zip\n", " 96 MB https://object.pouta.csc.fi/OPUS-JW300/v1/xml/ts.zip\n", "\n", " 366 MB Total size\n", "./JW300_latest_xml_en-ts.xml.gz ... 100% of 7 MB\n", "./JW300_latest_xml_en.zip ... 100% of 263 MB\n", "./JW300_latest_xml_ts.zip ... 100% of 96 MB\n" ], "name": "stdout" } ] }, { "cell_type": "code", "metadata": { "id": "n48GDRnP8y2G", "colab_type": "code", "outputId": "c1fdb32a-5520-44e5-ff67-bc1f1331d7c3", "colab": { "base_uri": "https://localhost:8080/", "height": 571 } }, "source": [ "# Download the global test set.\n", "! wget https://raw.githubusercontent.com/juliakreutzer/masakhane/master/jw300_utils/test/test.en-any.en\n", " \n", "# And the specific test set for this language pair.\n", "os.environ[\"trg\"] = target_language \n", "os.environ[\"src\"] = source_language \n", "\n", "! wget https://raw.githubusercontent.com/juliakreutzer/masakhane/master/jw300_utils/test/test.en-$trg.en \n", "! mv test.en-$trg.en test.en\n", "! wget https://raw.githubusercontent.com/juliakreutzer/masakhane/master/jw300_utils/test/test.en-$trg.$trg \n", "! mv test.en-$trg.$trg test.$trg" ], "execution_count": 0, "outputs": [ { "output_type": "stream", "text": [ "--2020-01-12 07:56:31-- https://raw.githubusercontent.com/juliakreutzer/masakhane/master/jw300_utils/test/test.en-any.en\n", "Resolving raw.githubusercontent.com (raw.githubusercontent.com)... 151.101.0.133, 151.101.64.133, 151.101.128.133, ...\n", "Connecting to raw.githubusercontent.com (raw.githubusercontent.com)|151.101.0.133|:443... connected.\n", "HTTP request sent, awaiting response... 
200 OK\n", "Length: 277791 (271K) [text/plain]\n", "Saving to: ‘test.en-any.en’\n", "\n", "test.en-any.en 100%[===================>] 271.28K --.-KB/s in 0.04s \n", "\n", "2020-01-12 07:56:31 (6.25 MB/s) - ‘test.en-any.en’ saved [277791/277791]\n", "\n", "--2020-01-12 07:56:33-- https://raw.githubusercontent.com/juliakreutzer/masakhane/master/jw300_utils/test/test.en-ts.en\n", "Resolving raw.githubusercontent.com (raw.githubusercontent.com)... 151.101.0.133, 151.101.64.133, 151.101.128.133, ...\n", "Connecting to raw.githubusercontent.com (raw.githubusercontent.com)|151.101.0.133|:443... connected.\n", "HTTP request sent, awaiting response... 200 OK\n", "Length: 205972 (201K) [text/plain]\n", "Saving to: ‘test.en-ts.en’\n", "\n", "test.en-ts.en 100%[===================>] 201.14K --.-KB/s in 0.03s \n", "\n", "2020-01-12 07:56:34 (6.65 MB/s) - ‘test.en-ts.en’ saved [205972/205972]\n", "\n", "--2020-01-12 07:56:38-- https://raw.githubusercontent.com/juliakreutzer/masakhane/master/jw300_utils/test/test.en-ts.ts\n", "Resolving raw.githubusercontent.com (raw.githubusercontent.com)... 151.101.0.133, 151.101.64.133, 151.101.128.133, ...\n", "Connecting to raw.githubusercontent.com (raw.githubusercontent.com)|151.101.0.133|:443... connected.\n", "HTTP request sent, awaiting response... 200 OK\n", "Length: 248950 (243K) [text/plain]\n", "Saving to: ‘test.en-ts.ts’\n", "\n", "test.en-ts.ts 100%[===================>] 243.12K --.-KB/s in 0.04s \n", "\n", "2020-01-12 07:56:38 (5.67 MB/s) - ‘test.en-ts.ts’ saved [248950/248950]\n", "\n" ], "name": "stdout" } ] }, { "cell_type": "code", "metadata": { "id": "NqDG-CI28y2L", "colab_type": "code", "outputId": "2a96b899-6a65-4f08-ca7e-9527bfa4af16", "colab": { "base_uri": "https://localhost:8080/", "height": 34 } }, "source": [ "# Read the test data to filter from train and dev splits.\n", "# Store the English portion in a set for quick filtering checks.\n", "en_test_sents = set()\n", "filter_test_sents = \"test.en-any.en\"\n", "j = 0\n", "with open(filter_test_sents) as f:\n", " for line in f:\n", " en_test_sents.add(line.strip())\n", " j += 1\n", "print('Loaded {} global test sentences to filter from the training/dev data.'.format(j))" ], "execution_count": 0, "outputs": [ { "output_type": "stream", "text": [ "Loaded 3571 global test sentences to filter from the training/dev data.\n" ], "name": "stdout" } ] }, { "cell_type": "code", "metadata": { "colab_type": "code", "id": "3CNdwLBCfSIl", "outputId": "c3b6f190-0b43-4968-dc1f-6cc99b46b4e5", "colab": { "base_uri": "https://localhost:8080/", "height": 153 } }, "source": [ "import pandas as pd\n", "\n", "# Build the dataframe from the plain-text corpus files written by opus_read\n", "source_file = 'jw300.' + source_language\n", "target_file = 'jw300.' 
+ target_language\n", "\n", "source = []\n", "target = []\n", "skip_lines = [] # Collect the line numbers of the source portion to skip the same lines for the target portion.\n", "with open(source_file) as f:\n", " for i, line in enumerate(f):\n", " # Skip sentences that are contained in the test set.\n", " if line.strip() not in en_test_sents:\n", " source.append(line.strip())\n", " else:\n", " skip_lines.append(i) \n", "with open(target_file) as f:\n", " for j, line in enumerate(f):\n", " # Only add to corpus if corresponding source was not skipped.\n", " if j not in skip_lines:\n", " target.append(line.strip())\n", " \n", "print('Loaded data and skipped {}/{} lines since contained in test set.'.format(len(skip_lines), i))\n", " \n", "df = pd.DataFrame(zip(source, target), columns=['source_sentence', 'target_sentence'])\n", "# If you get \"TypeError: data argument can't be an iterator\", it's because of your pandas version; run the line below instead.\n", "#df = pd.DataFrame(list(zip(source, target)), columns=['source_sentence', 'target_sentence'])\n", "df.head(3)" ], "execution_count": 0, "outputs": [ { "output_type": "stream", "text": [ "Loaded data and skipped 7170/853024 lines since contained in test set.\n" ], "name": "stdout" }, { "output_type": "execute_result", "data": { "text/html": [ "
\n", "\n", "\n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", "
source_sentencetarget_sentence
0Dowsing ​ — Scientific or Occult ?Ku Femba — I Sayense Kumbe I Vungoma ?
1“ AMAZING ! ”“ WA hlamarisa ! ”
2exclaimed a dairy farmer in the Midwestern Uni...ku huwelela n’wamapursi wa vufuwela - masi le ...
\n", "
" ], "text/plain": [ " source_sentence target_sentence\n", "0 Dowsing ​ — Scientific or Occult ? Ku Femba — I Sayense Kumbe I Vungoma ?\n", "1 “ AMAZING ! ” “ WA hlamarisa ! ”\n", "2 exclaimed a dairy farmer in the Midwestern Uni... ku huwelela n’wamapursi wa vufuwela - masi le ..." ] }, "metadata": { "tags": [] }, "execution_count": 8 } ] }, { "cell_type": "markdown", "metadata": { "colab_type": "text", "id": "YkuK3B4p2AkN" }, "source": [ "## Pre-processing and export\n", "\n", "It is generally a good idea to remove duplicate translations and conflicting translations from the corpus. In practice, these public corpora include some number of these that need to be cleaned.\n", "\n", "In addition we will split our data into dev/test/train and export to the filesystem." ] }, { "cell_type": "code", "metadata": { "colab_type": "code", "id": "M_2ouEOH1_1q", "outputId": "0bbdf65d-7075-4dc0-9e5e-7ea3c7e0a04f", "colab": { "base_uri": "https://localhost:8080/", "height": 185 } }, "source": [ "# drop duplicate translations\n", "df_pp = df.drop_duplicates()\n", "\n", "# drop conflicting translations\n", "# (this is optional and something that you might want to comment out \n", "# depending on the size of your corpus)\n", "df_pp.drop_duplicates(subset='source_sentence', inplace=True)\n", "df_pp.drop_duplicates(subset='target_sentence', inplace=True)\n", "\n", "# Shuffle the data to remove bias in dev set selection.\n", "df_pp = df_pp.sample(frac=1, random_state=seed).reset_index(drop=True)" ], "execution_count": 0, "outputs": [ { "output_type": "stream", "text": [ "/usr/local/lib/python3.6/dist-packages/ipykernel_launcher.py:6: SettingWithCopyWarning: \n", "A value is trying to be set on a copy of a slice from a DataFrame\n", "\n", "See the caveats in the documentation: http://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy\n", " \n", "/usr/local/lib/python3.6/dist-packages/ipykernel_launcher.py:7: SettingWithCopyWarning: \n", "A value is trying to be set on a copy of a slice from a DataFrame\n", "\n", "See the caveats in the documentation: http://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy\n", " import sys\n" ], "name": "stderr" } ] }, { "cell_type": "code", "metadata": { "id": "Z_1BwAApEtMk", "colab_type": "code", "outputId": "e8013690-8463-4dea-93cf-abaf9dd268fa", "colab": { "base_uri": "https://localhost:8080/", "height": 1000 } }, "source": [ "# Install fuzzy wuzzy to remove \"almost duplicate\" sentences in the\n", "# test and training sets.\n", "! pip install fuzzywuzzy\n", "! pip install python-Levenshtein\n", "import time\n", "from fuzzywuzzy import process\n", "import numpy as np\n", "\n", "# reset the index of the training set after previous filtering\n", "df_pp.reset_index(drop=False, inplace=True)\n", "\n", "# Remove samples from the training data set if they \"almost overlap\" with the\n", "# samples in the test set.\n", "\n", "# Filtering function. Adjust pad to narrow down the candidate matches to\n", "# within a certain length of characters of the given sample.\n", "def fuzzfilter(sample, candidates, pad):\n", " candidates = [x for x in candidates if len(x) <= len(sample)+pad and len(x) >= len(sample)-pad] \n", " if len(candidates) > 0:\n", " return process.extractOne(sample, candidates)[1]\n", " else:\n", " return np.nan\n", "\n", "# NOTE - This might run slow depending on the size of your training set. We are\n", "# printing some information to help you track how long it would take. 
\n", "scores = []\n", "start_time = time.time()\n", "for idx, row in df_pp.iterrows():\n", " scores.append(fuzzfilter(row['source_sentence'], list(en_test_sents), 5))\n", " if idx % 1000 == 0:\n", " hours, rem = divmod(time.time() - start_time, 3600)\n", " minutes, seconds = divmod(rem, 60)\n", " print(\"{:0>2}:{:0>2}:{:05.2f}\".format(int(hours),int(minutes),seconds), \"%0.2f percent complete\" % (100.0*float(idx)/float(len(df_pp))))\n", "\n", "# Filter out \"almost overlapping samples\"\n", "df_pp['scores'] = scores\n", "df_pp = df_pp[df_pp['scores'] < 95]" ], "execution_count": 0, "outputs": [ { "output_type": "stream", "text": [ "Collecting fuzzywuzzy\n", " Downloading https://files.pythonhosted.org/packages/d8/f1/5a267addb30ab7eaa1beab2b9323073815da4551076554ecc890a3595ec9/fuzzywuzzy-0.17.0-py2.py3-none-any.whl\n", "Installing collected packages: fuzzywuzzy\n", "Successfully installed fuzzywuzzy-0.17.0\n", "Collecting python-Levenshtein\n", "\u001b[?25l Downloading https://files.pythonhosted.org/packages/42/a9/d1785c85ebf9b7dfacd08938dd028209c34a0ea3b1bcdb895208bd40a67d/python-Levenshtein-0.12.0.tar.gz (48kB)\n", "\u001b[K |████████████████████████████████| 51kB 2.0MB/s \n", "\u001b[?25hRequirement already satisfied: setuptools in /usr/local/lib/python3.6/dist-packages (from python-Levenshtein) (42.0.2)\n", "Building wheels for collected packages: python-Levenshtein\n", " Building wheel for python-Levenshtein (setup.py) ... \u001b[?25l\u001b[?25hdone\n", " Created wheel for python-Levenshtein: filename=python_Levenshtein-0.12.0-cp36-cp36m-linux_x86_64.whl size=144668 sha256=af7227e367da5ab44d12864a032388d1f91ad66813b6324f428ad4d4051d276e\n", " Stored in directory: /root/.cache/pip/wheels/de/c2/93/660fd5f7559049268ad2dc6d81c4e39e9e36518766eaf7e342\n", "Successfully built python-Levenshtein\n", "Installing collected packages: python-Levenshtein\n", "Successfully installed python-Levenshtein-0.12.0\n", "00:00:00.32 0.00 percent complete\n", "00:00:23.27 0.13 percent complete\n", "00:00:47.22 0.26 percent complete\n", "00:01:10.63 0.39 percent complete\n", "00:01:34.28 0.52 percent complete\n", "00:01:57.40 0.65 percent complete\n", "00:02:20.49 0.78 percent complete\n", "00:02:43.69 0.91 percent complete\n", "00:03:07.31 1.04 percent complete\n", "00:03:30.38 1.17 percent complete\n", "00:03:54.32 1.30 percent complete\n", "00:04:17.98 1.43 percent complete\n", "00:04:41.40 1.56 percent complete\n", "00:05:05.13 1.69 percent complete\n" ], "name": "stdout" }, { "output_type": "stream", "text": [ "WARNING:root:Applied processor reduces input query to empty string, all comparisons will have score 0. [Query: '⇩']\n" ], "name": "stderr" }, { "output_type": "stream", "text": [ "00:05:28.89 1.82 percent complete\n", "00:05:52.75 1.95 percent complete\n" ], "name": "stdout" }, { "output_type": "stream", "text": [ "WARNING:root:Applied processor reduces input query to empty string, all comparisons will have score 0. [Query: '↓ ↓']\n" ], "name": "stderr" }, { "output_type": "stream", "text": [ "00:06:15.81 2.07 percent complete\n" ], "name": "stdout" }, { "output_type": "stream", "text": [ "WARNING:root:Applied processor reduces input query to empty string, all comparisons will have score 0. [Query: '” ?']\n", "WARNING:root:Applied processor reduces input query to empty string, all comparisons will have score 0. 
[Query: '․ ․ ․ ․ ․ ․ ․ ․ ․ ․ ․ ․ ․ ․ ․ ․ ․ ․ ․ ․']\n" ], "name": "stderr" }, { "output_type": "stream", "text": [ "00:06:39.30 2.20 percent complete\n", "00:07:01.96 2.33 percent complete\n", "00:07:25.97 2.46 percent complete\n", "00:07:49.50 2.59 percent complete\n", "00:08:12.57 2.72 percent complete\n", "00:08:35.87 2.85 percent complete\n", "00:08:58.55 2.98 percent complete\n", "00:09:21.65 3.11 percent complete\n", "00:09:44.55 3.24 percent complete\n", "00:10:08.15 3.37 percent complete\n" ], "name": "stdout" }, { "output_type": "stream", "text": [ "WARNING:root:Applied processor reduces input query to empty string, all comparisons will have score 0. [Query: '․ ․ ․']\n" ], "name": "stderr" }, { "output_type": "stream", "text": [ "00:10:32.04 3.50 percent complete\n", "00:10:54.97 3.63 percent complete\n", "00:11:18.58 3.76 percent complete\n", "00:11:42.29 3.89 percent complete\n", "00:12:05.27 4.02 percent complete\n", "00:12:28.40 4.15 percent complete\n", "00:12:51.70 4.28 percent complete\n", "00:13:14.65 4.41 percent complete\n", "00:13:38.47 4.54 percent complete\n", "00:14:02.18 4.67 percent complete\n", "00:14:26.14 4.80 percent complete\n", "00:14:49.12 4.93 percent complete\n", "00:15:12.18 5.06 percent complete\n", "00:15:35.69 5.19 percent complete\n", "00:15:59.03 5.32 percent complete\n", "00:16:22.66 5.45 percent complete\n", "00:16:46.44 5.58 percent complete\n", "00:17:10.38 5.71 percent complete\n", "00:17:34.42 5.84 percent complete\n", "00:17:57.87 5.96 percent complete\n", "00:18:20.91 6.09 percent complete\n", "00:18:44.01 6.22 percent complete\n", "00:19:07.93 6.35 percent complete\n", "00:19:31.96 6.48 percent complete\n", "00:19:55.89 6.61 percent complete\n", "00:20:18.82 6.74 percent complete\n", "00:20:42.58 6.87 percent complete\n", "00:21:05.97 7.00 percent complete\n", "00:21:29.15 7.13 percent complete\n", "00:21:52.31 7.26 percent complete\n", "00:22:15.38 7.39 percent complete\n", "00:22:38.48 7.52 percent complete\n", "00:23:02.38 7.65 percent complete\n", "00:23:26.71 7.78 percent complete\n", "00:23:50.26 7.91 percent complete\n", "00:24:13.25 8.04 percent complete\n", "00:24:37.17 8.17 percent complete\n", "00:25:00.56 8.30 percent complete\n", "00:25:23.36 8.43 percent complete\n", "00:25:47.23 8.56 percent complete\n", "00:26:09.70 8.69 percent complete\n" ], "name": "stdout" }, { "output_type": "stream", "text": [ "WARNING:root:Applied processor reduces input query to empty string, all comparisons will have score 0. 
[Query: '● ● ● ● ●']\n" ], "name": "stderr" }, { "output_type": "stream", "text": [ "00:26:32.59 8.82 percent complete\n", "00:26:55.89 8.95 percent complete\n", "00:27:19.61 9.08 percent complete\n", "00:27:43.46 9.21 percent complete\n", "00:28:06.57 9.34 percent complete\n", "00:28:29.87 9.47 percent complete\n", "00:28:53.70 9.60 percent complete\n", "00:29:17.47 9.73 percent complete\n", "00:29:41.28 9.86 percent complete\n", "00:30:04.39 9.98 percent complete\n", "00:30:27.66 10.11 percent complete\n", "00:30:50.92 10.24 percent complete\n", "00:31:14.12 10.37 percent complete\n", "00:31:37.79 10.50 percent complete\n", "00:32:02.09 10.63 percent complete\n", "00:32:25.12 10.76 percent complete\n", "00:32:48.78 10.89 percent complete\n", "00:33:12.62 11.02 percent complete\n", "00:33:35.98 11.15 percent complete\n", "00:33:59.53 11.28 percent complete\n", "00:34:22.76 11.41 percent complete\n", "00:34:46.99 11.54 percent complete\n", "00:35:10.51 11.67 percent complete\n", "00:35:33.30 11.80 percent complete\n", "00:35:57.64 11.93 percent complete\n", "00:36:21.87 12.06 percent complete\n", "00:36:44.76 12.19 percent complete\n", "00:37:07.82 12.32 percent complete\n", "00:37:31.17 12.45 percent complete\n", "00:37:54.03 12.58 percent complete\n", "00:38:17.46 12.71 percent complete\n", "00:38:40.67 12.84 percent complete\n", "00:39:05.24 12.97 percent complete\n", "00:39:28.48 13.10 percent complete\n", "00:39:51.96 13.23 percent complete\n", "00:40:15.08 13.36 percent complete\n", "00:40:38.80 13.49 percent complete\n", "00:41:01.67 13.62 percent complete\n", "00:41:24.41 13.75 percent complete\n", "00:41:47.74 13.87 percent complete\n" ], "name": "stdout" }, { "output_type": "stream", "text": [ "WARNING:root:Applied processor reduces input query to empty string, all comparisons will have score 0. [Query: '↓ ↓ ↓']\n" ], "name": "stderr" }, { "output_type": "stream", "text": [ "00:42:12.33 14.00 percent complete\n", "00:42:35.81 14.13 percent complete\n", "00:42:58.76 14.26 percent complete\n", "00:43:22.76 14.39 percent complete\n", "00:43:46.38 14.52 percent complete\n", "00:44:09.19 14.65 percent complete\n", "00:44:32.07 14.78 percent complete\n", "00:44:55.52 14.91 percent complete\n", "00:45:19.69 15.04 percent complete\n", "00:45:43.69 15.17 percent complete\n", "00:46:06.72 15.30 percent complete\n", "00:46:30.17 15.43 percent complete\n", "00:46:53.59 15.56 percent complete\n", "00:47:16.46 15.69 percent complete\n", "00:47:38.70 15.82 percent complete\n", "00:48:01.78 15.95 percent complete\n", "00:48:25.56 16.08 percent complete\n", "00:48:49.61 16.21 percent complete\n", "00:49:13.07 16.34 percent complete\n", "00:49:36.72 16.47 percent complete\n", "00:50:00.74 16.60 percent complete\n", "00:50:24.49 16.73 percent complete\n", "00:50:48.19 16.86 percent complete\n", "00:51:11.46 16.99 percent complete\n", "00:51:34.58 17.12 percent complete\n", "00:51:58.09 17.25 percent complete\n", "00:52:21.67 17.38 percent complete\n", "00:52:45.16 17.51 percent complete\n" ], "name": "stdout" }, { "output_type": "stream", "text": [ "WARNING:root:Applied processor reduces input query to empty string, all comparisons will have score 0. 
[Query: '\\']\n" ], "name": "stderr" }, { "output_type": "stream", "text": [ "00:53:08.66 17.64 percent complete\n", "00:53:32.02 17.77 percent complete\n", "00:53:54.52 17.89 percent complete\n" ], "name": "stdout" }, { "output_type": "stream", "text": [ "WARNING:root:Applied processor reduces input query to empty string, all comparisons will have score 0. [Query: '- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -']\n" ], "name": "stderr" }, { "output_type": "stream", "text": [ "00:54:17.10 18.02 percent complete\n", "00:54:40.88 18.15 percent complete\n", "00:55:04.48 18.28 percent complete\n", "00:55:27.80 18.41 percent complete\n", "00:55:51.49 18.54 percent complete\n", "00:56:15.10 18.67 percent complete\n", "00:56:38.59 18.80 percent complete\n", "00:57:01.60 18.93 percent complete\n", "00:57:25.04 19.06 percent complete\n", "00:57:48.19 19.19 percent complete\n", "00:58:12.81 19.32 percent complete\n", "00:58:35.58 19.45 percent complete\n", "00:58:58.27 19.58 percent complete\n", "00:59:22.29 19.71 percent complete\n", "00:59:45.35 19.84 percent complete\n", "01:00:09.48 19.97 percent complete\n", "01:00:32.95 20.10 percent complete\n", "01:00:56.31 20.23 percent complete\n", "01:01:20.32 20.36 percent complete\n", "01:01:43.67 20.49 percent complete\n", "01:02:06.95 20.62 percent complete\n", "01:02:30.32 20.75 percent complete\n", "01:02:54.09 20.88 percent complete\n", "01:03:17.37 21.01 percent complete\n", "01:03:40.99 21.14 percent complete\n", "01:04:04.76 21.27 percent complete\n", "01:04:28.55 21.40 percent complete\n", "01:04:51.83 21.53 percent complete\n", "01:05:14.50 21.66 percent complete\n", "01:05:37.62 21.79 percent complete\n", "01:06:01.07 21.91 percent complete\n", "01:06:24.35 22.04 percent complete\n", "01:06:47.90 22.17 percent complete\n", "01:07:11.09 22.30 percent complete\n", "01:07:34.62 22.43 percent complete\n", "01:07:58.07 22.56 percent complete\n", "01:08:21.46 22.69 percent complete\n", "01:08:44.94 22.82 percent complete\n", "01:09:08.03 22.95 percent complete\n", "01:09:30.89 23.08 percent complete\n", "01:09:54.46 23.21 percent complete\n", "01:10:17.96 23.34 percent complete\n", "01:10:42.33 23.47 percent complete\n", "01:11:05.77 23.60 percent complete\n", "01:11:28.85 23.73 percent complete\n", "01:11:52.13 23.86 percent complete\n", "01:12:15.89 23.99 percent complete\n", "01:12:39.60 24.12 percent complete\n", "01:13:02.46 24.25 percent complete\n", "01:13:26.08 24.38 percent complete\n", "01:13:49.98 24.51 percent complete\n", "01:14:12.70 24.64 percent complete\n", "01:14:36.55 24.77 percent complete\n", "01:14:59.70 24.90 percent complete\n", "01:15:23.29 25.03 percent complete\n", "01:15:46.73 25.16 percent complete\n", "01:16:09.70 25.29 percent complete\n", "01:16:33.82 25.42 percent complete\n", "01:16:57.49 25.55 percent complete\n", "01:17:21.26 25.68 percent complete\n", "01:17:45.01 25.80 percent complete\n", "01:18:09.40 25.93 percent complete\n", "01:18:33.64 26.06 percent complete\n", "01:18:56.90 26.19 percent complete\n", "01:19:20.35 26.32 percent complete\n", "01:19:43.46 26.45 percent complete\n", "01:20:07.22 26.58 percent complete\n", "01:20:30.63 26.71 percent complete\n", "01:20:54.11 26.84 percent complete\n", "01:21:18.01 26.97 percent complete\n", "01:21:41.50 27.10 percent complete\n", "01:22:05.34 27.23 percent complete\n", "01:22:28.66 27.36 percent complete\n", "01:22:52.21 27.49 percent complete\n", "01:23:16.32 27.62 percent 
complete\n", "01:23:40.06 27.75 percent complete\n", "01:24:02.80 27.88 percent complete\n", "01:24:26.17 28.01 percent complete\n", "01:24:49.25 28.14 percent complete\n", "01:25:13.58 28.27 percent complete\n", "01:25:37.14 28.40 percent complete\n", "01:26:01.73 28.53 percent complete\n", "01:26:25.07 28.66 percent complete\n", "01:26:48.38 28.79 percent complete\n", "01:27:11.44 28.92 percent complete\n", "01:27:35.48 29.05 percent complete\n" ], "name": "stdout" }, { "output_type": "stream", "text": [ "WARNING:root:Applied processor reduces input query to empty string, all comparisons will have score 0. [Query: '$ ․ ․ ․ ․ ․']\n" ], "name": "stderr" }, { "output_type": "stream", "text": [ "01:27:59.22 29.18 percent complete\n", "01:28:22.67 29.31 percent complete\n", "01:28:45.72 29.44 percent complete\n", "01:29:08.77 29.57 percent complete\n", "01:29:32.74 29.70 percent complete\n", "01:29:56.52 29.82 percent complete\n", "01:30:19.02 29.95 percent complete\n", "01:30:42.54 30.08 percent complete\n", "01:31:06.05 30.21 percent complete\n", "01:31:29.58 30.34 percent complete\n", "01:31:53.22 30.47 percent complete\n", "01:32:16.17 30.60 percent complete\n", "01:32:39.78 30.73 percent complete\n", "01:33:04.06 30.86 percent complete\n", "01:33:28.07 30.99 percent complete\n", "01:33:51.68 31.12 percent complete\n", "01:34:14.99 31.25 percent complete\n", "01:34:38.36 31.38 percent complete\n", "01:35:01.89 31.51 percent complete\n", "01:35:25.19 31.64 percent complete\n", "01:35:49.12 31.77 percent complete\n", "01:36:13.00 31.90 percent complete\n", "01:36:36.13 32.03 percent complete\n", "01:36:59.97 32.16 percent complete\n", "01:37:23.62 32.29 percent complete\n", "01:37:46.95 32.42 percent complete\n", "01:38:10.53 32.55 percent complete\n" ], "name": "stdout" }, { "output_type": "stream", "text": [ "WARNING:root:Applied processor reduces input query to empty string, all comparisons will have score 0. [Query: '← ←']\n" ], "name": "stderr" }, { "output_type": "stream", "text": [ "01:38:33.86 32.68 percent complete\n", "01:38:57.57 32.81 percent complete\n", "01:39:21.43 32.94 percent complete\n", "01:39:44.95 33.07 percent complete\n", "01:40:08.20 33.20 percent complete\n", "01:40:31.16 33.33 percent complete\n", "01:40:54.95 33.46 percent complete\n", "01:41:17.60 33.59 percent complete\n", "01:41:41.20 33.71 percent complete\n" ], "name": "stdout" }, { "output_type": "stream", "text": [ "WARNING:root:Applied processor reduces input query to empty string, all comparisons will have score 0. 
[Query: '←']\n" ], "name": "stderr" }, { "output_type": "stream", "text": [ "01:42:04.26 33.84 percent complete\n", "01:42:28.08 33.97 percent complete\n", "01:42:52.83 34.10 percent complete\n", "01:43:16.09 34.23 percent complete\n", "01:43:39.34 34.36 percent complete\n", "01:44:02.64 34.49 percent complete\n", "01:44:25.72 34.62 percent complete\n", "01:44:49.04 34.75 percent complete\n", "01:45:12.51 34.88 percent complete\n", "01:45:36.36 35.01 percent complete\n", "01:46:00.02 35.14 percent complete\n", "01:46:23.82 35.27 percent complete\n", "01:46:47.74 35.40 percent complete\n", "01:47:11.22 35.53 percent complete\n", "01:47:34.34 35.66 percent complete\n", "01:47:58.52 35.79 percent complete\n", "01:48:22.26 35.92 percent complete\n", "01:48:46.15 36.05 percent complete\n", "01:49:10.20 36.18 percent complete\n", "01:49:33.35 36.31 percent complete\n", "01:49:56.08 36.44 percent complete\n", "01:50:19.53 36.57 percent complete\n", "01:50:43.55 36.70 percent complete\n", "01:51:07.21 36.83 percent complete\n", "01:51:30.03 36.96 percent complete\n", "01:51:53.20 37.09 percent complete\n", "01:52:16.82 37.22 percent complete\n", "01:52:40.07 37.35 percent complete\n", "01:53:04.17 37.48 percent complete\n", "01:53:27.98 37.61 percent complete\n", "01:53:51.20 37.73 percent complete\n", "01:54:14.92 37.86 percent complete\n", "01:54:38.18 37.99 percent complete\n", "01:55:01.88 38.12 percent complete\n", "01:55:24.73 38.25 percent complete\n", "01:55:47.95 38.38 percent complete\n", "01:56:11.21 38.51 percent complete\n", "01:56:34.88 38.64 percent complete\n", "01:56:58.38 38.77 percent complete\n", "01:57:22.38 38.90 percent complete\n", "01:57:45.66 39.03 percent complete\n", "01:58:09.17 39.16 percent complete\n", "01:58:32.28 39.29 percent complete\n", "01:58:55.65 39.42 percent complete\n", "01:59:18.74 39.55 percent complete\n", "01:59:41.45 39.68 percent complete\n", "02:00:05.19 39.81 percent complete\n", "02:00:28.34 39.94 percent complete\n", "02:00:51.37 40.07 percent complete\n", "02:01:15.11 40.20 percent complete\n", "02:01:39.20 40.33 percent complete\n", "02:02:02.68 40.46 percent complete\n", "02:02:26.52 40.59 percent complete\n", "02:02:50.36 40.72 percent complete\n", "02:03:13.67 40.85 percent complete\n" ], "name": "stdout" }, { "output_type": "stream", "text": [ "WARNING:root:Applied processor reduces input query to empty string, all comparisons will have score 0. [Query: '⇧']\n" ], "name": "stderr" }, { "output_type": "stream", "text": [ "02:03:37.14 40.98 percent complete\n", "02:04:01.15 41.11 percent complete\n", "02:04:24.42 41.24 percent complete\n", "02:04:47.28 41.37 percent complete\n" ], "name": "stdout" }, { "output_type": "stream", "text": [ "WARNING:root:Applied processor reduces input query to empty string, all comparisons will have score 0. [Query: '”']\n" ], "name": "stderr" }, { "output_type": "stream", "text": [ "02:05:11.21 41.50 percent complete\n", "02:05:34.65 41.62 percent complete\n", "02:05:58.07 41.75 percent complete\n" ], "name": "stdout" }, { "output_type": "stream", "text": [ "WARNING:root:Applied processor reduces input query to empty string, all comparisons will have score 0. 
[Query: '→ →']\n" ], "name": "stderr" }, { "output_type": "stream", "text": [ "02:06:20.81 41.88 percent complete\n", "02:06:43.87 42.01 percent complete\n", "02:07:06.75 42.14 percent complete\n", "02:07:30.31 42.27 percent complete\n", "02:07:54.51 42.40 percent complete\n", "02:08:17.13 42.53 percent complete\n", "02:08:40.93 42.66 percent complete\n", "02:09:04.12 42.79 percent complete\n", "02:09:27.64 42.92 percent complete\n", "02:09:51.82 43.05 percent complete\n", "02:10:14.97 43.18 percent complete\n", "02:10:38.30 43.31 percent complete\n", "02:11:02.34 43.44 percent complete\n", "02:11:25.55 43.57 percent complete\n", "02:11:48.65 43.70 percent complete\n", "02:12:12.30 43.83 percent complete\n", "02:12:36.26 43.96 percent complete\n", "02:13:00.11 44.09 percent complete\n", "02:13:24.01 44.22 percent complete\n" ], "name": "stdout" }, { "output_type": "stream", "text": [ "WARNING:root:Applied processor reduces input query to empty string, all comparisons will have score 0. [Query: '← ◯']\n" ], "name": "stderr" }, { "output_type": "stream", "text": [ "02:13:47.61 44.35 percent complete\n", "02:14:10.69 44.48 percent complete\n", "02:14:33.71 44.61 percent complete\n", "02:14:57.77 44.74 percent complete\n", "02:15:21.05 44.87 percent complete\n", "02:15:44.11 45.00 percent complete\n" ], "name": "stdout" }, { "output_type": "stream", "text": [ "WARNING:root:Applied processor reduces input query to empty string, all comparisons will have score 0. [Query: '↓']\n" ], "name": "stderr" }, { "output_type": "stream", "text": [ "02:16:06.75 45.13 percent complete\n", "02:16:29.93 45.26 percent complete\n", "02:16:53.70 45.39 percent complete\n", "02:17:17.50 45.52 percent complete\n", "02:17:40.39 45.64 percent complete\n", "02:18:04.09 45.77 percent complete\n", "02:18:27.98 45.90 percent complete\n", "02:18:51.74 46.03 percent complete\n", "02:19:15.54 46.16 percent complete\n", "02:19:38.51 46.29 percent complete\n", "02:20:01.79 46.42 percent complete\n", "02:20:25.93 46.55 percent complete\n", "02:20:49.35 46.68 percent complete\n", "02:21:12.69 46.81 percent complete\n", "02:21:35.86 46.94 percent complete\n", "02:21:59.20 47.07 percent complete\n", "02:22:21.85 47.20 percent complete\n", "02:22:45.54 47.33 percent complete\n", "02:23:08.83 47.46 percent complete\n", "02:23:32.52 47.59 percent complete\n", "02:23:56.35 47.72 percent complete\n", "02:24:19.47 47.85 percent complete\n", "02:24:43.28 47.98 percent complete\n", "02:25:06.42 48.11 percent complete\n", "02:25:29.32 48.24 percent complete\n", "02:25:53.31 48.37 percent complete\n", "02:26:16.25 48.50 percent complete\n", "02:26:39.95 48.63 percent complete\n", "02:27:04.11 48.76 percent complete\n", "02:27:27.13 48.89 percent complete\n", "02:27:50.91 49.02 percent complete\n", "02:28:14.18 49.15 percent complete\n", "02:28:37.01 49.28 percent complete\n", "02:29:00.30 49.41 percent complete\n", "02:29:24.20 49.53 percent complete\n", "02:29:46.94 49.66 percent complete\n", "02:30:11.09 49.79 percent complete\n", "02:30:33.97 49.92 percent complete\n", "02:30:57.53 50.05 percent complete\n", "02:31:20.48 50.18 percent complete\n", "02:31:44.31 50.31 percent complete\n", "02:32:07.92 50.44 percent complete\n", "02:32:31.46 50.57 percent complete\n", "02:32:54.73 50.70 percent complete\n", "02:33:17.96 50.83 percent complete\n", "02:33:41.31 50.96 percent complete\n", "02:34:04.79 51.09 percent complete\n", "02:34:29.16 51.22 percent complete\n", "02:34:52.32 51.35 percent complete\n", "02:35:16.05 51.48 percent 
complete\n", "02:35:39.28 51.61 percent complete\n", "02:36:02.55 51.74 percent complete\n", "02:36:26.53 51.87 percent complete\n", "02:36:49.48 52.00 percent complete\n", "02:37:12.98 52.13 percent complete\n", "02:37:35.96 52.26 percent complete\n", "02:37:59.19 52.39 percent complete\n", "02:38:23.11 52.52 percent complete\n", "02:38:45.32 52.65 percent complete\n", "02:39:08.52 52.78 percent complete\n" ], "name": "stdout" }, { "output_type": "stream", "text": [ "WARNING:root:Applied processor reduces input query to empty string, all comparisons will have score 0. [Query: '→']\n" ], "name": "stderr" }, { "output_type": "stream", "text": [ "02:39:31.99 52.91 percent complete\n", "02:39:54.82 53.04 percent complete\n", "02:40:18.96 53.17 percent complete\n", "02:40:42.65 53.30 percent complete\n", "02:41:05.67 53.43 percent complete\n", "02:41:28.91 53.55 percent complete\n", "02:41:52.02 53.68 percent complete\n", "02:42:15.44 53.81 percent complete\n", "02:42:39.13 53.94 percent complete\n", "02:43:01.89 54.07 percent complete\n", "02:43:24.84 54.20 percent complete\n", "02:43:48.93 54.33 percent complete\n", "02:44:12.24 54.46 percent complete\n", "02:44:35.08 54.59 percent complete\n" ], "name": "stdout" }, { "output_type": "stream", "text": [ "WARNING:root:Applied processor reduces input query to empty string, all comparisons will have score 0. [Query: '*']\n" ], "name": "stderr" }, { "output_type": "stream", "text": [ "02:44:57.92 54.72 percent complete\n", "02:45:21.08 54.85 percent complete\n", "02:45:44.39 54.98 percent complete\n", "02:46:07.82 55.11 percent complete\n", "02:46:31.62 55.24 percent complete\n", "02:46:54.89 55.37 percent complete\n", "02:47:18.52 55.50 percent complete\n", "02:47:41.62 55.63 percent complete\n", "02:48:05.57 55.76 percent complete\n", "02:48:28.83 55.89 percent complete\n", "02:48:52.89 56.02 percent complete\n" ], "name": "stdout" }, { "output_type": "stream", "text": [ "WARNING:root:Applied processor reduces input query to empty string, all comparisons will have score 0. 
[Query: '— ― ― ― ― ― ― ―']\n" ], "name": "stderr" }, { "output_type": "stream", "text": [ "02:49:16.39 56.15 percent complete\n", "02:49:39.09 56.28 percent complete\n", "02:50:02.08 56.41 percent complete\n", "02:50:25.13 56.54 percent complete\n", "02:50:48.62 56.67 percent complete\n", "02:51:12.48 56.80 percent complete\n", "02:51:35.99 56.93 percent complete\n", "02:51:59.90 57.06 percent complete\n", "02:52:22.93 57.19 percent complete\n", "02:52:45.54 57.32 percent complete\n", "02:53:08.76 57.45 percent complete\n", "02:53:31.74 57.57 percent complete\n", "02:53:55.27 57.70 percent complete\n", "02:54:19.18 57.83 percent complete\n", "02:54:42.23 57.96 percent complete\n", "02:55:06.15 58.09 percent complete\n", "02:55:29.35 58.22 percent complete\n", "02:55:53.00 58.35 percent complete\n", "02:56:16.96 58.48 percent complete\n", "02:56:40.81 58.61 percent complete\n", "02:57:04.20 58.74 percent complete\n", "02:57:28.12 58.87 percent complete\n", "02:57:51.65 59.00 percent complete\n", "02:58:16.27 59.13 percent complete\n", "02:58:40.01 59.26 percent complete\n", "02:59:03.17 59.39 percent complete\n", "02:59:26.82 59.52 percent complete\n", "02:59:51.13 59.65 percent complete\n", "03:00:15.15 59.78 percent complete\n", "03:00:38.36 59.91 percent complete\n", "03:01:02.46 60.04 percent complete\n", "03:01:26.38 60.17 percent complete\n", "03:01:49.60 60.30 percent complete\n", "03:02:12.54 60.43 percent complete\n", "03:02:36.06 60.56 percent complete\n", "03:02:59.89 60.69 percent complete\n", "03:03:22.58 60.82 percent complete\n", "03:03:45.76 60.95 percent complete\n", "03:04:08.89 61.08 percent complete\n", "03:04:32.17 61.21 percent complete\n", "03:04:55.88 61.34 percent complete\n", "03:05:19.45 61.46 percent complete\n" ], "name": "stdout" }, { "output_type": "stream", "text": [ "WARNING:root:Applied processor reduces input query to empty string, all comparisons will have score 0. [Query: '” *']\n" ], "name": "stderr" }, { "output_type": "stream", "text": [ "03:05:42.51 61.59 percent complete\n", "03:06:06.06 61.72 percent complete\n" ], "name": "stdout" }, { "output_type": "stream", "text": [ "WARNING:root:Applied processor reduces input query to empty string, all comparisons will have score 0. 
[Query: '→ → →']\n" ], "name": "stderr" }, { "output_type": "stream", "text": [ "03:06:30.03 61.85 percent complete\n", "03:06:53.17 61.98 percent complete\n", "03:07:16.77 62.11 percent complete\n", "03:07:39.96 62.24 percent complete\n", "03:08:03.71 62.37 percent complete\n", "03:08:27.18 62.50 percent complete\n", "03:08:51.24 62.63 percent complete\n", "03:09:15.72 62.76 percent complete\n", "03:09:39.74 62.89 percent complete\n", "03:10:03.57 63.02 percent complete\n", "03:10:27.55 63.15 percent complete\n", "03:10:51.54 63.28 percent complete\n", "03:11:15.46 63.41 percent complete\n", "03:11:38.99 63.54 percent complete\n", "03:12:03.00 63.67 percent complete\n", "03:12:26.51 63.80 percent complete\n", "03:12:50.06 63.93 percent complete\n", "03:13:13.18 64.06 percent complete\n", "03:13:36.04 64.19 percent complete\n", "03:14:00.05 64.32 percent complete\n", "03:14:23.54 64.45 percent complete\n", "03:14:47.26 64.58 percent complete\n", "03:15:11.30 64.71 percent complete\n", "03:15:35.74 64.84 percent complete\n", "03:15:59.31 64.97 percent complete\n", "03:16:22.09 65.10 percent complete\n", "03:16:45.65 65.23 percent complete\n", "03:17:09.90 65.36 percent complete\n", "03:17:33.65 65.48 percent complete\n", "03:17:57.35 65.61 percent complete\n", "03:18:20.01 65.74 percent complete\n", "03:18:43.24 65.87 percent complete\n", "03:19:06.64 66.00 percent complete\n", "03:19:29.41 66.13 percent complete\n", "03:19:52.58 66.26 percent complete\n", "03:20:16.09 66.39 percent complete\n", "03:20:40.42 66.52 percent complete\n", "03:21:04.19 66.65 percent complete\n", "03:21:26.89 66.78 percent complete\n", "03:21:51.04 66.91 percent complete\n", "03:22:14.29 67.04 percent complete\n", "03:22:37.75 67.17 percent complete\n", "03:23:01.22 67.30 percent complete\n", "03:23:24.79 67.43 percent complete\n", "03:23:49.01 67.56 percent complete\n" ], "name": "stdout" }, { "output_type": "stream", "text": [ "WARNING:root:Applied processor reduces input query to empty string, all comparisons will have score 0. [Query: '●']\n" ], "name": "stderr" }, { "output_type": "stream", "text": [ "03:24:12.36 67.69 percent complete\n", "03:24:35.82 67.82 percent complete\n", "03:24:58.35 67.95 percent complete\n", "03:25:21.28 68.08 percent complete\n", "03:25:44.63 68.21 percent complete\n", "03:26:07.46 68.34 percent complete\n", "03:26:31.01 68.47 percent complete\n", "03:26:55.16 68.60 percent complete\n" ], "name": "stdout" }, { "output_type": "stream", "text": [ "WARNING:root:Applied processor reduces input query to empty string, all comparisons will have score 0. 
[Query: '․ ․ ․ ․ ․ ․ ․ ․ ․ ․']\n" ], "name": "stderr" }, { "output_type": "stream", "text": [ "03:27:17.70 68.73 percent complete\n", "03:27:42.07 68.86 percent complete\n", "03:28:04.97 68.99 percent complete\n", "03:28:28.81 69.12 percent complete\n", "03:28:53.10 69.25 percent complete\n", "03:29:16.01 69.37 percent complete\n", "03:29:39.26 69.50 percent complete\n", "03:30:03.12 69.63 percent complete\n", "03:30:26.93 69.76 percent complete\n", "03:30:49.80 69.89 percent complete\n", "03:31:13.09 70.02 percent complete\n", "03:31:36.96 70.15 percent complete\n", "03:32:00.35 70.28 percent complete\n", "03:32:23.70 70.41 percent complete\n", "03:32:46.42 70.54 percent complete\n", "03:33:09.38 70.67 percent complete\n", "03:33:33.23 70.80 percent complete\n", "03:33:56.70 70.93 percent complete\n", "03:34:18.89 71.06 percent complete\n", "03:34:42.15 71.19 percent complete\n", "03:35:05.21 71.32 percent complete\n", "03:35:28.70 71.45 percent complete\n", "03:35:52.23 71.58 percent complete\n", "03:36:15.44 71.71 percent complete\n", "03:36:39.03 71.84 percent complete\n", "03:37:02.63 71.97 percent complete\n", "03:37:25.39 72.10 percent complete\n", "03:37:48.81 72.23 percent complete\n", "03:38:12.34 72.36 percent complete\n", "03:38:35.76 72.49 percent complete\n", "03:38:59.76 72.62 percent complete\n", "03:39:23.60 72.75 percent complete\n", "03:39:46.27 72.88 percent complete\n", "03:40:09.62 73.01 percent complete\n", "03:40:33.19 73.14 percent complete\n", "03:40:55.66 73.27 percent complete\n", "03:41:19.22 73.39 percent complete\n", "03:41:42.24 73.52 percent complete\n", "03:42:06.15 73.65 percent complete\n", "03:42:29.33 73.78 percent complete\n", "03:42:52.76 73.91 percent complete\n", "03:43:15.70 74.04 percent complete\n", "03:43:39.02 74.17 percent complete\n", "03:44:02.83 74.30 percent complete\n", "03:44:26.18 74.43 percent complete\n", "03:44:49.62 74.56 percent complete\n", "03:45:13.14 74.69 percent complete\n", "03:45:35.82 74.82 percent complete\n", "03:46:00.23 74.95 percent complete\n", "03:46:23.65 75.08 percent complete\n", "03:46:47.37 75.21 percent complete\n", "03:47:11.43 75.34 percent complete\n", "03:47:34.73 75.47 percent complete\n", "03:47:58.16 75.60 percent complete\n", "03:48:21.13 75.73 percent complete\n", "03:48:44.19 75.86 percent complete\n", "03:49:07.36 75.99 percent complete\n", "03:49:30.90 76.12 percent complete\n", "03:49:53.96 76.25 percent complete\n", "03:50:16.57 76.38 percent complete\n", "03:50:39.81 76.51 percent complete\n", "03:51:02.79 76.64 percent complete\n", "03:51:26.07 76.77 percent complete\n", "03:51:49.06 76.90 percent complete\n", "03:52:11.95 77.03 percent complete\n", "03:52:34.97 77.16 percent complete\n", "03:52:58.27 77.28 percent complete\n", "03:53:21.78 77.41 percent complete\n", "03:53:44.93 77.54 percent complete\n", "03:54:08.74 77.67 percent complete\n", "03:54:32.57 77.80 percent complete\n", "03:54:55.99 77.93 percent complete\n", "03:55:19.68 78.06 percent complete\n", "03:55:43.23 78.19 percent complete\n", "03:56:05.88 78.32 percent complete\n", "03:56:28.95 78.45 percent complete\n", "03:56:53.17 78.58 percent complete\n", "03:57:16.45 78.71 percent complete\n", "03:57:40.12 78.84 percent complete\n", "03:58:03.49 78.97 percent complete\n", "03:58:27.34 79.10 percent complete\n", "03:58:50.64 79.23 percent complete\n", "03:59:14.14 79.36 percent complete\n", "03:59:38.16 79.49 percent complete\n", "04:00:01.70 79.62 percent complete\n" ], "name": "stdout" }, { "output_type": "stream", "text": [ 
"WARNING:root:Applied processor reduces input query to empty string, all comparisons will have score 0. [Query: '□ ․ ․ ․ ․ ․']\n" ], "name": "stderr" }, { "output_type": "stream", "text": [ "04:00:24.76 79.75 percent complete\n", "04:00:47.96 79.88 percent complete\n", "04:01:11.12 80.01 percent complete\n", "04:01:35.08 80.14 percent complete\n", "04:01:58.87 80.27 percent complete\n" ], "name": "stdout" }, { "output_type": "stream", "text": [ "WARNING:root:Applied processor reduces input query to empty string, all comparisons will have score 0. [Query: '* * *']\n" ], "name": "stderr" }, { "output_type": "stream", "text": [ "04:02:22.85 80.40 percent complete\n", "04:02:46.41 80.53 percent complete\n", "04:03:10.11 80.66 percent complete\n", "04:03:33.17 80.79 percent complete\n", "04:03:56.22 80.92 percent complete\n", "04:04:20.04 81.05 percent complete\n", "04:04:43.35 81.18 percent complete\n", "04:05:06.63 81.30 percent complete\n", "04:05:30.59 81.43 percent complete\n", "04:05:54.33 81.56 percent complete\n", "04:06:18.25 81.69 percent complete\n", "04:06:41.57 81.82 percent complete\n", "04:07:05.21 81.95 percent complete\n", "04:07:28.21 82.08 percent complete\n", "04:07:51.50 82.21 percent complete\n", "04:08:15.68 82.34 percent complete\n", "04:08:39.56 82.47 percent complete\n", "04:09:03.76 82.60 percent complete\n", "04:09:27.05 82.73 percent complete\n", "04:09:50.87 82.86 percent complete\n", "04:10:13.78 82.99 percent complete\n", "04:10:36.84 83.12 percent complete\n", "04:11:00.06 83.25 percent complete\n", "04:11:23.40 83.38 percent complete\n", "04:11:47.06 83.51 percent complete\n", "04:12:10.25 83.64 percent complete\n", "04:12:33.95 83.77 percent complete\n" ], "name": "stdout" }, { "output_type": "stream", "text": [ "WARNING:root:Applied processor reduces input query to empty string, all comparisons will have score 0. [Query: '․ ․ ․ ․ ․ ․ ․ ․ ․ ․ ․ ․ ․ ․ ․ ․ ․ ․ ․ ․ ․ ․ ․ ․ ․ ․ ․ ․ ․ ․']\n" ], "name": "stderr" }, { "output_type": "stream", "text": [ "04:12:57.13 83.90 percent complete\n", "04:13:20.42 84.03 percent complete\n", "04:13:43.88 84.16 percent complete\n", "04:14:07.37 84.29 percent complete\n", "04:14:30.72 84.42 percent complete\n", "04:14:54.55 84.55 percent complete\n", "04:15:17.53 84.68 percent complete\n", "04:15:41.40 84.81 percent complete\n", "04:16:04.36 84.94 percent complete\n", "04:16:27.64 85.07 percent complete\n", "04:16:51.35 85.20 percent complete\n", "04:17:14.76 85.32 percent complete\n", "04:17:38.29 85.45 percent complete\n", "04:18:02.26 85.58 percent complete\n", "04:18:26.00 85.71 percent complete\n", "04:18:49.52 85.84 percent complete\n" ], "name": "stdout" }, { "output_type": "stream", "text": [ "WARNING:root:Applied processor reduces input query to empty string, all comparisons will have score 0. [Query: '. .']\n" ], "name": "stderr" }, { "output_type": "stream", "text": [ "04:19:13.21 85.97 percent complete\n" ], "name": "stdout" }, { "output_type": "stream", "text": [ "WARNING:root:Applied processor reduces input query to empty string, all comparisons will have score 0. 
[Query: '$ $ $']\n" ], "name": "stderr" }, { "output_type": "stream", "text": [ "04:19:36.56 86.10 percent complete\n", "04:19:59.76 86.23 percent complete\n", "04:20:23.04 86.36 percent complete\n", "04:20:47.07 86.49 percent complete\n", "04:21:10.30 86.62 percent complete\n", "04:21:33.90 86.75 percent complete\n", "04:21:57.86 86.88 percent complete\n", "04:22:21.61 87.01 percent complete\n", "04:22:44.91 87.14 percent complete\n", "04:23:07.81 87.27 percent complete\n", "04:23:31.42 87.40 percent complete\n", "04:23:55.14 87.53 percent complete\n", "04:24:18.65 87.66 percent complete\n", "04:24:41.65 87.79 percent complete\n", "04:25:05.42 87.92 percent complete\n", "04:25:28.31 88.05 percent complete\n", "04:25:52.43 88.18 percent complete\n", "04:26:16.07 88.31 percent complete\n", "04:26:39.40 88.44 percent complete\n", "04:27:03.59 88.57 percent complete\n", "04:27:27.28 88.70 percent complete\n", "04:27:51.04 88.83 percent complete\n", "04:28:14.16 88.96 percent complete\n", "04:28:37.25 89.09 percent complete\n", "04:29:00.93 89.21 percent complete\n" ], "name": "stdout" }, { "output_type": "stream", "text": [ "WARNING:root:Applied processor reduces input query to empty string, all comparisons will have score 0. [Query: '“ . . .']\n" ], "name": "stderr" }, { "output_type": "stream", "text": [ "04:29:24.66 89.34 percent complete\n", "04:29:48.37 89.47 percent complete\n", "04:30:11.23 89.60 percent complete\n", "04:30:34.72 89.73 percent complete\n", "04:30:58.39 89.86 percent complete\n", "04:31:20.75 89.99 percent complete\n", "04:31:44.40 90.12 percent complete\n", "04:32:07.43 90.25 percent complete\n", "04:32:30.81 90.38 percent complete\n", "04:32:53.90 90.51 percent complete\n" ], "name": "stdout" }, { "output_type": "stream", "text": [ "WARNING:root:Applied processor reduces input query to empty string, all comparisons will have score 0. 
[Query: '↓ ↓ ↓ ↓']\n" ], "name": "stderr" }, { "output_type": "stream", "text": [ "04:33:17.16 90.64 percent complete\n", "04:33:40.93 90.77 percent complete\n", "04:34:04.58 90.90 percent complete\n", "04:34:28.06 91.03 percent complete\n", "04:34:51.43 91.16 percent complete\n", "04:35:15.71 91.29 percent complete\n", "04:35:40.02 91.42 percent complete\n", "04:36:03.36 91.55 percent complete\n", "04:36:27.82 91.68 percent complete\n", "04:36:52.06 91.81 percent complete\n", "04:37:14.68 91.94 percent complete\n", "04:37:38.26 92.07 percent complete\n", "04:38:01.50 92.20 percent complete\n", "04:38:24.92 92.33 percent complete\n", "04:38:48.35 92.46 percent complete\n", "04:39:10.99 92.59 percent complete\n", "04:39:35.08 92.72 percent complete\n", "04:39:58.58 92.85 percent complete\n", "04:40:22.30 92.98 percent complete\n", "04:40:45.63 93.11 percent complete\n", "04:41:09.35 93.23 percent complete\n", "04:41:33.12 93.36 percent complete\n", "04:41:56.72 93.49 percent complete\n", "04:42:19.88 93.62 percent complete\n", "04:42:43.58 93.75 percent complete\n", "04:43:08.11 93.88 percent complete\n", "04:43:31.80 94.01 percent complete\n", "04:43:55.84 94.14 percent complete\n", "04:44:18.50 94.27 percent complete\n", "04:44:41.49 94.40 percent complete\n", "04:45:04.05 94.53 percent complete\n", "04:45:27.53 94.66 percent complete\n", "04:45:50.96 94.79 percent complete\n", "04:46:15.16 94.92 percent complete\n", "04:46:38.85 95.05 percent complete\n", "04:47:02.49 95.18 percent complete\n", "04:47:26.01 95.31 percent complete\n", "04:47:48.73 95.44 percent complete\n", "04:48:11.97 95.57 percent complete\n" ], "name": "stdout" }, { "output_type": "stream", "text": [ "WARNING:root:Applied processor reduces input query to empty string, all comparisons will have score 0. [Query: '․ ․']\n" ], "name": "stderr" }, { "output_type": "stream", "text": [ "04:48:35.07 95.70 percent complete\n", "04:48:58.40 95.83 percent complete\n", "04:49:22.59 95.96 percent complete\n", "04:49:45.14 96.09 percent complete\n", "04:50:08.70 96.22 percent complete\n", "04:50:31.72 96.35 percent complete\n", "04:50:55.85 96.48 percent complete\n", "04:51:18.94 96.61 percent complete\n", "04:51:42.51 96.74 percent complete\n", "04:52:05.43 96.87 percent complete\n", "04:52:29.10 97.00 percent complete\n", "04:52:52.33 97.12 percent complete\n", "04:53:15.84 97.25 percent complete\n", "04:53:39.87 97.38 percent complete\n", "04:54:03.54 97.51 percent complete\n", "04:54:26.86 97.64 percent complete\n", "04:54:50.21 97.77 percent complete\n", "04:55:13.23 97.90 percent complete\n", "04:55:36.53 98.03 percent complete\n", "04:56:00.25 98.16 percent complete\n", "04:56:23.56 98.29 percent complete\n", "04:56:47.46 98.42 percent complete\n" ], "name": "stdout" }, { "output_type": "stream", "text": [ "WARNING:root:Applied processor reduces input query to empty string, all comparisons will have score 0. 
[Query: '← ← ← ← ← ← ← ← ← ←']\n" ], "name": "stderr" }, { "output_type": "stream", "text": [ "04:57:10.69 98.55 percent complete\n", "04:57:34.57 98.68 percent complete\n", "04:57:58.16 98.81 percent complete\n", "04:58:22.14 98.94 percent complete\n", "04:58:46.25 99.07 percent complete\n", "04:59:09.46 99.20 percent complete\n", "04:59:33.09 99.33 percent complete\n", "04:59:56.97 99.46 percent complete\n", "05:00:20.35 99.59 percent complete\n", "05:00:43.63 99.72 percent complete\n", "05:01:07.55 99.85 percent complete\n", "05:01:30.90 99.98 percent complete\n" ], "name": "stdout" } ] }, { "cell_type": "code", "metadata": { "colab_type": "code", "id": "hxxBOCA-xXhy", "outputId": "1f179866-8cab-4d27-dc24-77a8167047f9", "colab": { "base_uri": "https://localhost:8080/", "height": 810 } }, "source": [ "# This section does the split between train/dev for the parallel corpora then saves them as separate files\n", "# We use 1000 dev test and the given test set.\n", "import csv\n", "\n", "# Do the split between dev/train and create parallel corpora\n", "num_dev_patterns = 1000\n", "\n", "# Optional: lower case the corpora - this will make it easier to generalize, but without proper casing.\n", "if lc: # Julia: making lowercasing optional\n", " df_pp[\"source_sentence\"] = df_pp[\"source_sentence\"].str.lower()\n", " df_pp[\"target_sentence\"] = df_pp[\"target_sentence\"].str.lower()\n", "\n", "# Julia: test sets are already generated\n", "dev = df_pp.tail(num_dev_patterns) # Herman: Error in original\n", "stripped = df_pp.drop(df_pp.tail(num_dev_patterns).index)\n", "\n", "with open(\"train.\"+source_language, \"w\") as src_file, open(\"train.\"+target_language, \"w\") as trg_file:\n", " for index, row in stripped.iterrows():\n", " src_file.write(row[\"source_sentence\"]+\"\\n\")\n", " trg_file.write(row[\"target_sentence\"]+\"\\n\")\n", " \n", "with open(\"dev.\"+source_language, \"w\") as src_file, open(\"dev.\"+target_language, \"w\") as trg_file:\n", " for index, row in dev.iterrows():\n", " src_file.write(row[\"source_sentence\"]+\"\\n\")\n", " trg_file.write(row[\"target_sentence\"]+\"\\n\")\n", "\n", "#stripped[[\"source_sentence\"]].to_csv(\"train.\"+source_language, header=False, index=False) # Herman: Added `header=False` everywhere\n", "#stripped[[\"target_sentence\"]].to_csv(\"train.\"+target_language, header=False, index=False) # Julia: Problematic handling of quotation marks.\n", "\n", "#dev[[\"source_sentence\"]].to_csv(\"dev.\"+source_language, header=False, index=False)\n", "#dev[[\"target_sentence\"]].to_csv(\"dev.\"+target_language, header=False, index=False)\n", "\n", "# Doublecheck the format below. There should be no extra quotation marks or weird characters.\n", "! head train.*\n", "! head dev.*" ], "execution_count": 0, "outputs": [ { "output_type": "stream", "text": [ "==> train.en <==\n", "However , the real cause of their distress went far deeper .\n", "If they dig down into Sheol , from there my own hand will take them ; and if they go up to the heavens , from there I shall bring them down . ” ​ — Amos 9 : 1 , 2 .\n", "Through his sacred Word , the Holy Bible , God has made clear what his will is .\n", "We remembered Jesus ’ admonition to his disciples that they should “ keep bearing much fruit . 
”\n", "The lower back dial predicted solar and lunar eclipses\n", "Most of the time , I traveled from one congregation to the next by bus or by train .\n", "The challenge is to be happy as well as organized in all that we do .\n", "Some spouses are holding down two jobs just to get by , and in other cases both spouses are working , leaving the children with grandparents or at a child - care center .\n", "We can see that from the resurrections Jesus performed when outside of Nain and when in the home of Jairus .\n", "Another way we can honor Jehovah God is by making monetary contributions to the worldwide preaching work that he has authorized .\n", "\n", "==> train.ts <==\n", "Hambi swi ri tano , xivangelo xa ntiyiso xa gome ra vona xi ye xi nyanya .\n", "Loko va cela eSheol , voko ra mina ri ta va teka kona ; loko va tlhandlukela ematilweni , ndzi ta va xikisa kona . ” — Amosi 9 : 1 , 2 .\n", "Hi ku tirhisa Rito ra xona leri xiximekaka , Bibele yo Kwetsima , Xikwembu xi swi veke erivaleni leswaku ku rhandza ka xona hi kwihi .\n", "Hi tsundzuke xikhutazo lexi Yesu a xi nyikeke vadyondzisiwa va yena leswaku va fanele va ‘ hambeta va tswala mihandzu yo tala . ’\n", "Ximhandzana xa le ndzhaku lexi nga le hansi a xi vhumbha ku sirhiwa ka dyambu hi n’weti ni loko misava yi ri exikarhi ka dyambu ni n’weti\n", "Minkarhi yo tala loko ndzi endzela mavandlha a ndzi famba hi bazi kumbe hi xitimela .\n", "Mphikamakaneta i ku va la tsakeke ni ku hleleka eka hinkwaswo leswi hi swi endlaka .\n", "Mimpatswa yin’wana yi tirha mintirho yimbirhi leswaku yi kota ku hakelela swilaveko swa xisekelo naswona minkarhi yin’wana vatekani havambirhi va tirha kutani va siya vana va vona ni vakokwa wa vona kumbe ekhirexe .\n", "Hi vona sweswo loko Yesu a pfuxe van’wana ehandle ka muti wa Nayini ni loko a ri kaya ra Yayiro .\n", "Ndlela yin’wana leyi hi nga xiximaka Yehova Xikwembu ha yona i ku endla minyikelo ya mali eka ntirho wo chumayela wa misava hinkwawo lowu a wu leriseke .\n", "==> dev.en <==\n", "But , naturally , we also want relief .\n", "3 “ We’re Letting You Go ”\n", "Paralleling our actions against terrorism , we must have equally vigorous efforts to enhance freedom , dignity , justice , and humanitarian values .\n", "Baptism symbolizes dedication to God and marks one’s ordination as a minister\n", "“ My mother told me that I should pray to God , ” said a young girl in São Paulo .\n", "It is 1,506 feet [ 459 m ] long and is still used as part of the network of underground railways serving Greater London .\n", "I found the years that I spent in the Service Department especially enjoyable .\n", "Regarding the heavenly organization , there is no doubt as to the answer to this question .\n", "The God - Given Desire to Serve\n", "Many who pray to Mary have been taught that blessings can be gained by the repetition of set formulas ​ — prayers such as the Hail Mary , Our Father , and others .\n", "\n", "==> dev.ts <==\n", "Kambe hi lava leswaku ku xaniseka ku herisiwa .\n", "3 “ Ntirho Wa Wena Wu Herile ”\n", "Loko hi ri karhi hi lwisana ni vutherorisi , hi fanele hi endla matshalatshala lamakulu yo lwela ntshunxeko , xindzhuti , vululami ni ku antswisa timfanelo ta hina .\n", "Ku khuvuriwa ku kombisa ku tinyiketela eka Xikwembu naswona ku fungha ku vekiwa ka munhu tanihi mutirheli\n", "Xinhwanyatana xin’wana xa le São Paulo xi te : “ Mana wa mina u ndzi byele leswaku ndzi fanele ndzi khongela eka Xikwembu .\n", "Wu lehe 459 wa timitara naswona wa ha tirhisiwa hi switimela leswi tleketlaka vanhu eGreater London 
.\n", "Malembe lawa ndzi ma heteke eka Ndzawulo ya Ntirho a ma tsakisa swinene .\n", "Malunghana ni nhlengeletano ya le tilweni , nhlamulo ya xivutiso lexi yi le rivaleni .\n", "Nyiko Ya Xikwembu Ya Ku Navela Ku Xi Tirhela\n", "Vo tala lava khongelaka eka Mariya va dyondzisiwe leswaku va nga kuma mikateko loko va phindha - phindha swikhongelo hi ndlela yo hlawuleka — swikhongelo swo fana ni xikhongelo xo Dzunisa Mariya , Xikhongelo xa Hosi ni swin’wana .\n" ], "name": "stdout" } ] }, { "cell_type": "markdown", "metadata": { "colab_type": "text", "id": "epeCydmCyS8X" }, "source": [ "\n", "\n", "---\n", "\n", "\n", "## Installation of JoeyNMT\n", "\n", "JoeyNMT is a simple, minimalist NMT package which is useful for learning and teaching. Check out the documentation for JoeyNMT [here](https://joeynmt.readthedocs.io) " ] }, { "cell_type": "code", "metadata": { "colab_type": "code", "id": "iBRMm4kMxZ8L", "outputId": "65c57e50-0e72-4b31-a86e-da0cf82dd69d", "colab": { "base_uri": "https://localhost:8080/", "height": 1000 } }, "source": [ "# Install JoeyNMT\n", "! git clone https://github.com/joeynmt/joeynmt.git\n", "! cd joeynmt; pip3 install ." ], "execution_count": 0, "outputs": [ { "output_type": "stream", "text": [ "Cloning into 'joeynmt'...\n", "remote: Enumerating objects: 15, done.\u001b[K\n", "remote: Counting objects: 6% (1/15)\u001b[K\rremote: Counting objects: 13% (2/15)\u001b[K\rremote: Counting objects: 20% (3/15)\u001b[K\rremote: Counting objects: 26% (4/15)\u001b[K\rremote: Counting objects: 33% (5/15)\u001b[K\rremote: Counting objects: 40% (6/15)\u001b[K\rremote: Counting objects: 46% (7/15)\u001b[K\rremote: Counting objects: 53% (8/15)\u001b[K\rremote: Counting objects: 60% (9/15)\u001b[K\rremote: Counting objects: 66% (10/15)\u001b[K\rremote: Counting objects: 73% (11/15)\u001b[K\rremote: Counting objects: 80% (12/15)\u001b[K\rremote: Counting objects: 86% (13/15)\u001b[K\rremote: Counting objects: 93% (14/15)\u001b[K\rremote: Counting objects: 100% (15/15)\u001b[K\rremote: Counting objects: 100% (15/15), done.\u001b[K\n", "remote: Compressing objects: 100% (12/12), done.\u001b[K\n", "remote: Total 2199 (delta 4), reused 5 (delta 3), pack-reused 2184\u001b[K\n", "Receiving objects: 100% (2199/2199), 2.60 MiB | 16.87 MiB/s, done.\n", "Resolving deltas: 100% (1525/1525), done.\n", "Processing /content/joeynmt\n", "Requirement already satisfied: future in /usr/local/lib/python3.6/dist-packages (from joeynmt==0.0.1) (0.16.0)\n", "Requirement already satisfied: pillow in /usr/local/lib/python3.6/dist-packages (from joeynmt==0.0.1) (6.2.2)\n", "Requirement already satisfied: numpy<2.0,>=1.14.5 in /usr/local/lib/python3.6/dist-packages (from joeynmt==0.0.1) (1.17.5)\n", "Requirement already satisfied: setuptools>=41.0.0 in /usr/local/lib/python3.6/dist-packages (from joeynmt==0.0.1) (42.0.2)\n", "Requirement already satisfied: torch>=1.1 in /usr/local/lib/python3.6/dist-packages (from joeynmt==0.0.1) (1.3.1)\n", "Requirement already satisfied: tensorflow>=1.14 in /usr/local/lib/python3.6/dist-packages (from joeynmt==0.0.1) (1.15.0)\n", "Requirement already satisfied: torchtext in /usr/local/lib/python3.6/dist-packages (from joeynmt==0.0.1) (0.3.1)\n", "Collecting sacrebleu>=1.3.6\n", " Downloading https://files.pythonhosted.org/packages/45/31/1a135b964c169984b27fb2f7a50280fa7f8e6d9d404d8a9e596180487fd1/sacrebleu-1.4.3-py3-none-any.whl\n", "Collecting subword-nmt\n", " Downloading 
https://files.pythonhosted.org/packages/74/60/6600a7bc09e7ab38bc53a48a20d8cae49b837f93f5842a41fe513a694912/subword_nmt-0.3.7-py2.py3-none-any.whl\n", "Requirement already satisfied: matplotlib in /usr/local/lib/python3.6/dist-packages (from joeynmt==0.0.1) (3.1.2)\n", "Requirement already satisfied: seaborn in /usr/local/lib/python3.6/dist-packages (from joeynmt==0.0.1) (0.9.0)\n", "Collecting pyyaml>=5.1\n", "\u001b[?25l Downloading https://files.pythonhosted.org/packages/3d/d9/ea9816aea31beeadccd03f1f8b625ecf8f645bd66744484d162d84803ce5/PyYAML-5.3.tar.gz (268kB)\n", "\u001b[K |████████████████████████████████| 276kB 9.6MB/s \n", "\u001b[?25hCollecting pylint\n", "\u001b[?25l Downloading https://files.pythonhosted.org/packages/e9/59/43fc36c5ee316bb9aeb7cf5329cdbdca89e5749c34d5602753827c0aa2dc/pylint-2.4.4-py3-none-any.whl (302kB)\n", "\u001b[K |████████████████████████████████| 307kB 42.3MB/s \n", "\u001b[?25hRequirement already satisfied: six==1.12 in /usr/local/lib/python3.6/dist-packages (from joeynmt==0.0.1) (1.12.0)\n", "Requirement already satisfied: termcolor>=1.1.0 in /usr/local/lib/python3.6/dist-packages (from tensorflow>=1.14->joeynmt==0.0.1) (1.1.0)\n", "Requirement already satisfied: keras-preprocessing>=1.0.5 in /usr/local/lib/python3.6/dist-packages (from tensorflow>=1.14->joeynmt==0.0.1) (1.1.0)\n", "Requirement already satisfied: tensorboard<1.16.0,>=1.15.0 in /usr/local/lib/python3.6/dist-packages (from tensorflow>=1.14->joeynmt==0.0.1) (1.15.0)\n", "Requirement already satisfied: gast==0.2.2 in /usr/local/lib/python3.6/dist-packages (from tensorflow>=1.14->joeynmt==0.0.1) (0.2.2)\n", "Requirement already satisfied: tensorflow-estimator==1.15.1 in /usr/local/lib/python3.6/dist-packages (from tensorflow>=1.14->joeynmt==0.0.1) (1.15.1)\n", "Requirement already satisfied: wheel>=0.26 in /usr/local/lib/python3.6/dist-packages (from tensorflow>=1.14->joeynmt==0.0.1) (0.33.6)\n", "Requirement already satisfied: astor>=0.6.0 in /usr/local/lib/python3.6/dist-packages (from tensorflow>=1.14->joeynmt==0.0.1) (0.8.1)\n", "Requirement already satisfied: protobuf>=3.6.1 in /usr/local/lib/python3.6/dist-packages (from tensorflow>=1.14->joeynmt==0.0.1) (3.10.0)\n", "Requirement already satisfied: wrapt>=1.11.1 in /usr/local/lib/python3.6/dist-packages (from tensorflow>=1.14->joeynmt==0.0.1) (1.11.2)\n", "Requirement already satisfied: grpcio>=1.8.6 in /usr/local/lib/python3.6/dist-packages (from tensorflow>=1.14->joeynmt==0.0.1) (1.15.0)\n", "Requirement already satisfied: opt-einsum>=2.3.2 in /usr/local/lib/python3.6/dist-packages (from tensorflow>=1.14->joeynmt==0.0.1) (3.1.0)\n", "Requirement already satisfied: google-pasta>=0.1.6 in /usr/local/lib/python3.6/dist-packages (from tensorflow>=1.14->joeynmt==0.0.1) (0.1.8)\n", "Requirement already satisfied: absl-py>=0.7.0 in /usr/local/lib/python3.6/dist-packages (from tensorflow>=1.14->joeynmt==0.0.1) (0.9.0)\n", "Requirement already satisfied: keras-applications>=1.0.8 in /usr/local/lib/python3.6/dist-packages (from tensorflow>=1.14->joeynmt==0.0.1) (1.0.8)\n", "Requirement already satisfied: tqdm in /usr/local/lib/python3.6/dist-packages (from torchtext->joeynmt==0.0.1) (4.28.1)\n", "Requirement already satisfied: requests in /usr/local/lib/python3.6/dist-packages (from torchtext->joeynmt==0.0.1) (2.21.0)\n", "Requirement already satisfied: typing in /usr/local/lib/python3.6/dist-packages (from sacrebleu>=1.3.6->joeynmt==0.0.1) (3.6.6)\n", "Collecting portalocker\n", " Downloading 
https://files.pythonhosted.org/packages/91/db/7bc703c0760df726839e0699b7f78a4d8217fdc9c7fcb1b51b39c5a22a4e/portalocker-1.5.2-py2.py3-none-any.whl\n", "Requirement already satisfied: cycler>=0.10 in /usr/local/lib/python3.6/dist-packages (from matplotlib->joeynmt==0.0.1) (0.10.0)\n", "Requirement already satisfied: pyparsing!=2.0.4,!=2.1.2,!=2.1.6,>=2.0.1 in /usr/local/lib/python3.6/dist-packages (from matplotlib->joeynmt==0.0.1) (2.4.6)\n", "Requirement already satisfied: python-dateutil>=2.1 in /usr/local/lib/python3.6/dist-packages (from matplotlib->joeynmt==0.0.1) (2.6.1)\n", "Requirement already satisfied: kiwisolver>=1.0.1 in /usr/local/lib/python3.6/dist-packages (from matplotlib->joeynmt==0.0.1) (1.1.0)\n", "Requirement already satisfied: scipy>=0.14.0 in /usr/local/lib/python3.6/dist-packages (from seaborn->joeynmt==0.0.1) (1.4.1)\n", "Requirement already satisfied: pandas>=0.15.2 in /usr/local/lib/python3.6/dist-packages (from seaborn->joeynmt==0.0.1) (0.25.3)\n", "Collecting mccabe<0.7,>=0.6\n", " Downloading https://files.pythonhosted.org/packages/87/89/479dc97e18549e21354893e4ee4ef36db1d237534982482c3681ee6e7b57/mccabe-0.6.1-py2.py3-none-any.whl\n", "Collecting isort<5,>=4.2.5\n", "\u001b[?25l Downloading https://files.pythonhosted.org/packages/e5/b0/c121fd1fa3419ea9bfd55c7f9c4fedfec5143208d8c7ad3ce3db6c623c21/isort-4.3.21-py2.py3-none-any.whl (42kB)\n", "\u001b[K |████████████████████████████████| 51kB 7.0MB/s \n", "\u001b[?25hCollecting astroid<2.4,>=2.3.0\n", "\u001b[?25l Downloading https://files.pythonhosted.org/packages/ad/ae/86734823047962e7b8c8529186a1ac4a7ca19aaf1aa0c7713c022ef593fd/astroid-2.3.3-py3-none-any.whl (205kB)\n", "\u001b[K |████████████████████████████████| 215kB 43.6MB/s \n", "\u001b[?25hRequirement already satisfied: markdown>=2.6.8 in /usr/local/lib/python3.6/dist-packages (from tensorboard<1.16.0,>=1.15.0->tensorflow>=1.14->joeynmt==0.0.1) (3.1.1)\n", "Requirement already satisfied: werkzeug>=0.11.15 in /usr/local/lib/python3.6/dist-packages (from tensorboard<1.16.0,>=1.15.0->tensorflow>=1.14->joeynmt==0.0.1) (0.16.0)\n", "Requirement already satisfied: h5py in /usr/local/lib/python3.6/dist-packages (from keras-applications>=1.0.8->tensorflow>=1.14->joeynmt==0.0.1) (2.8.0)\n", "Requirement already satisfied: urllib3<1.25,>=1.21.1 in /usr/local/lib/python3.6/dist-packages (from requests->torchtext->joeynmt==0.0.1) (1.24.3)\n", "Requirement already satisfied: idna<2.9,>=2.5 in /usr/local/lib/python3.6/dist-packages (from requests->torchtext->joeynmt==0.0.1) (2.8)\n", "Requirement already satisfied: chardet<3.1.0,>=3.0.2 in /usr/local/lib/python3.6/dist-packages (from requests->torchtext->joeynmt==0.0.1) (3.0.4)\n", "Requirement already satisfied: certifi>=2017.4.17 in /usr/local/lib/python3.6/dist-packages (from requests->torchtext->joeynmt==0.0.1) (2019.11.28)\n", "Requirement already satisfied: pytz>=2017.2 in /usr/local/lib/python3.6/dist-packages (from pandas>=0.15.2->seaborn->joeynmt==0.0.1) (2018.9)\n", "Collecting lazy-object-proxy==1.4.*\n", "\u001b[?25l Downloading https://files.pythonhosted.org/packages/0b/dd/b1e3407e9e6913cf178e506cd0dee818e58694d9a5cd1984e3f6a8b9a10f/lazy_object_proxy-1.4.3-cp36-cp36m-manylinux1_x86_64.whl (55kB)\n", "\u001b[K |████████████████████████████████| 61kB 7.6MB/s \n", "\u001b[?25hCollecting typed-ast<1.5,>=1.4.0; implementation_name == \"cpython\" and python_version < \"3.8\"\n", "\u001b[?25l Downloading 
https://files.pythonhosted.org/packages/31/d3/9d1802c161626d0278bafb1ffb32f76b9d01e123881bbf9d91e8ccf28e18/typed_ast-1.4.0-cp36-cp36m-manylinux1_x86_64.whl (736kB)\n", "\u001b[K |████████████████████████████████| 737kB 44.3MB/s \n", "\u001b[?25hBuilding wheels for collected packages: joeynmt, pyyaml\n", " Building wheel for joeynmt (setup.py) ... \u001b[?25l\u001b[?25hdone\n", " Created wheel for joeynmt: filename=joeynmt-0.0.1-cp36-none-any.whl size=72136 sha256=6ef7b5ec1557e6e06124618e158d2ad7a5e9889d63b8b580388f0e1d88790ba6\n", " Stored in directory: /tmp/pip-ephem-wheel-cache-bnh801j_/wheels/db/01/db/751cc9f3e7f6faec127c43644ba250a3ea7ad200594aeda70a\n", " Building wheel for pyyaml (setup.py) ... \u001b[?25l\u001b[?25hdone\n", " Created wheel for pyyaml: filename=PyYAML-5.3-cp36-cp36m-linux_x86_64.whl size=44229 sha256=b196cb43b84dd9c46288cfeca251ce97fe7ff99be615a1376f0bb7b786d3e2b2\n", " Stored in directory: /root/.cache/pip/wheels/e4/76/4d/a95b8dd7b452b69e8ed4f68b69e1b55e12c9c9624dd962b191\n", "Successfully built joeynmt pyyaml\n", "Installing collected packages: portalocker, sacrebleu, subword-nmt, pyyaml, mccabe, isort, lazy-object-proxy, typed-ast, astroid, pylint, joeynmt\n", " Found existing installation: PyYAML 3.13\n", " Uninstalling PyYAML-3.13:\n", " Successfully uninstalled PyYAML-3.13\n", "Successfully installed astroid-2.3.3 isort-4.3.21 joeynmt-0.0.1 lazy-object-proxy-1.4.3 mccabe-0.6.1 portalocker-1.5.2 pylint-2.4.4 pyyaml-5.3 sacrebleu-1.4.3 subword-nmt-0.3.7 typed-ast-1.4.0\n" ], "name": "stdout" } ] }, { "cell_type": "markdown", "metadata": { "colab_type": "text", "id": "AaE77Tcppex9" }, "source": [ "# Preprocessing the Data into Subword BPE Tokens\n", "\n", "- One of the most powerful improvements for agglutinative languages (a feature of most Bantu languages) is using BPE tokenization [(Sennrich, 2015)](https://arxiv.org/abs/1508.07909).\n", "\n", "- It was also shown that optimizing the number of BPE codes significantly improves results for low-resourced languages [(Sennrich, 2019)](https://www.aclweb.org/anthology/P19-1021) [(Martinus, 2019)](https://arxiv.org/abs/1906.05685)\n", "\n", "- Below we have the scripts for doing BPE tokenization of our data. We use 4000 tokens as recommended by [(Sennrich, 2019)](https://www.aclweb.org/anthology/P19-1021). You do not need to change anything; simply running the cells below will be suitable. " ] }, { "cell_type": "code", "metadata": { "colab_type": "code", "id": "H-TyjtmXB1mL", "outputId": "04c476b5-de0d-45ad-f2fb-956a0a604a58", "colab": { "base_uri": "https://localhost:8080/", "height": 403 } }, "source": [ "# One of the huge boosts in NMT performance was to use a different method of tokenizing.\n", "# Usually, NMT would tokenize by words. However, using a method called BPE gave amazing boosts to performance.\n", "\n", "# Do subword NMT\n", "from os import path\n", "os.environ[\"src\"] = source_language # Sets them in bash as well, since we often use bash scripts\n", "os.environ[\"tgt\"] = target_language\n", "\n", "# Learn BPEs on the training data.\n", "os.environ[\"data_path\"] = path.join(\"joeynmt\", \"data\", source_language + target_language) # Herman! \n", "! subword-nmt learn-joint-bpe-and-vocab --input train.$src train.$tgt -s 4000 -o bpe.codes.4000 --write-vocabulary vocab.$src vocab.$tgt\n", "\n", 
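"# Note (added): apply-bpe rewrites each word as subword units, marking non-final pieces\n", "# with the \"@@ \" separator (e.g. \"endle@@ kaka\" in the sample output further down);\n", "# passing --vocabulary restricts the output to subwords seen in the training data.\n", 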
"# Apply the BPE splits to the training, development and test data.\n", "! subword-nmt apply-bpe -c bpe.codes.4000 --vocabulary vocab.$src < train.$src > train.bpe.$src\n", "! subword-nmt apply-bpe -c bpe.codes.4000 --vocabulary vocab.$tgt < train.$tgt > train.bpe.$tgt\n", "\n", "! subword-nmt apply-bpe -c bpe.codes.4000 --vocabulary vocab.$src < dev.$src > dev.bpe.$src\n", "! subword-nmt apply-bpe -c bpe.codes.4000 --vocabulary vocab.$tgt < dev.$tgt > dev.bpe.$tgt\n", "! subword-nmt apply-bpe -c bpe.codes.4000 --vocabulary vocab.$src < test.$src > test.bpe.$src\n", "! subword-nmt apply-bpe -c bpe.codes.4000 --vocabulary vocab.$tgt < test.$tgt > test.bpe.$tgt\n", "\n", "# Create the directory and move everything we care about to the correct location\n", "! mkdir -p $data_path\n", "! cp train.* $data_path\n", "! cp test.* $data_path\n", "! cp dev.* $data_path\n", "! cp bpe.codes.4000 $data_path\n", "! ls $data_path\n", "\n", "# Also move everything we care about to a mounted location in google drive (relevant if running in colab) at gdrive_path\n", "! cp train.* \"$gdrive_path\"\n", "! cp test.* \"$gdrive_path\"\n", "! cp dev.* \"$gdrive_path\"\n", "! cp bpe.codes.4000 \"$gdrive_path\"\n", "! ls \"$gdrive_path\"\n", "\n", "# Create the combined vocab using build_vocab\n", "! sudo chmod 777 joeynmt/scripts/build_vocab.py\n", "! joeynmt/scripts/build_vocab.py joeynmt/data/$src$tgt/train.bpe.$src joeynmt/data/$src$tgt/train.bpe.$tgt --output_path joeynmt/data/$src$tgt/vocab.txt\n", "\n", "# Some output\n", "! echo \"BPE Xitsonga Sentences\"\n", "! tail -n 5 test.bpe.$tgt\n", "! echo \"Combined BPE Vocab\"\n", "! tail -n 10 joeynmt/data/$src$tgt/vocab.txt # Herman" ], "execution_count": 0, "outputs": [ { "output_type": "stream", "text": [ "bpe.codes.4000\tdev.en\t test.bpe.ts test.ts\t train.en\n", "dev.bpe.en\tdev.ts\t test.en\t train.bpe.en train.ts\n", "dev.bpe.ts\ttest.bpe.en test.en-any.en train.bpe.ts\n", "bpe.codes.4000\tdev.en\t test.bpe.ts test.ts\t train.en\n", "dev.bpe.en\tdev.ts\t test.en\t train.bpe.en train.ts\n", "dev.bpe.ts\ttest.bpe.en test.en-any.en train.bpe.ts\n", "BPE Xitsonga Sentences\n", "( Hlaya Luka 21 : 34 , 35 . )\n", "I yini leswi nga endleka eka Petro , Yakobo na Yohane leswi nga endle@@ kaka ni le ka hina ?\n", "Hi ku ya hi Luka 21 : 36 , Yesu u hi dyondzisa njhani ku ‘ tshama hi h@@ it@@ ekile ’ ?\n", "Hi nga tiyiseka njhani leswaku hi swi lungh@@ ek@@ erile ku langutana ni leswi nga ta endleka ku nga ri khale ?\n", "[ 1 ] ( ndz@@ imana 14 ) Vona ndz@@ ima 21 ya buku leyi nge , Mfumo Wa Xikwembu Wa F@@ u@@ ma !\n", "Combined BPE Vocab\n", "ά@@\n", ";@@\n", "õ@@\n", "τ@@\n", "̭@@\n", "̌\n", "avulo@@\n", "Ο@@\n", "×\n", "̱\n" ], "name": "stdout" } ] }, { "cell_type": "code", "metadata": { "colab_type": "code", "id": "IlMitUHR8Qy-", "outputId": "c80e2910-3b89-4b37-843e-f5e1ce146cc0", "colab": { "base_uri": "https://localhost:8080/", "height": 67 } }, "source": [ "# Also move everything we care about to a mounted location in google drive (relevant if running in colab) at gdrive_path\n", "! cp train.* \"$gdrive_path\"\n", "! cp test.* \"$gdrive_path\"\n", "! cp dev.* \"$gdrive_path\"\n", "! cp bpe.codes.4000 \"$gdrive_path\"\n", "! ls \"$gdrive_path\"" ], "execution_count": 0, "outputs": [ { "output_type": "stream", "text": [ "bpe.codes.4000\tdev.en\t test.bpe.ts test.ts\t train.en\n", "dev.bpe.en\tdev.ts\t test.en\t train.bpe.en train.ts\n", "dev.bpe.ts\ttest.bpe.en test.en-any.en train.bpe.ts\n" ], "name": "stdout" } ] }, { "cell_type": "markdown", "metadata": { "colab_type": "text", "id": "Ixmzi60WsUZ8" }, "source": [ "# Creating the JoeyNMT Config\n", "\n", "JoeyNMT requires a YAML config. We provide a template below.
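\n", "\n", "Once the code cell below has written this config to disk, you can also tweak individual settings programmatically rather than re-editing the template string. A minimal sketch using PyYAML (installed above as a JoeyNMT dependency); the file name assumes the en-ts naming this notebook produces:\n", "\n", "```python\n", "import yaml\n", "\n", "config_path = \"joeynmt/configs/transformer_ents.yaml\"  # written by the cell below\n", "with open(config_path) as f:\n", "    cfg = yaml.safe_load(f)\n", "\n", "cfg[\"training\"][\"epochs\"] = 100   # e.g. a full run instead of the 30-epoch test run\n", "cfg[\"testing\"][\"beam_size\"] = 10  # try a wider beam when decoding\n", "\n", "with open(config_path, \"w\") as f:\n", "    yaml.dump(cfg, f, default_flow_style=False)\n", "```\n", "\n", "(Note that round-tripping through PyYAML drops the comments in the file.)\n", "\n", "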
We've also set a number of defaults in it that you may play with!\n", "\n", "- We use the Transformer architecture\n", "- We set our dropout reasonably high: 0.3 (recommended in [(Sennrich, 2019)](https://www.aclweb.org/anthology/P19-1021))\n", "\n", "Things worth playing with:\n", "- The batch size (also recommended to change for low-resourced languages)\n", "- The number of epochs (we've set it at 30 just so it runs in about an hour, for testing purposes)\n", "- The decoder options (beam_size, alpha)\n", "- Evaluation metrics (BLEU versus chrF)" ] }, { "cell_type": "code", "metadata": { "colab_type": "code", "id": "PIs1lY2hxMsl", "colab": {} }, "source": [ "# This creates the config file for our JoeyNMT system. It might seem overwhelming, so we've marked a couple of useful parameters you'll need to update\n", "# (You can of course play with all the parameters if you'd like!)\n", "\n", "name = '%s%s' % (source_language, target_language)\n", "gdrive_path = os.environ[\"gdrive_path\"]\n", "\n", "# Create the config\n", "config = \"\"\"\n", "name: \"{name}_transformer\"\n", "\n", "data:\n", " src: \"{source_language}\"\n", " trg: \"{target_language}\"\n", " train: \"data/{name}/train.bpe\"\n", " dev: \"data/{name}/dev.bpe\"\n", " test: \"data/{name}/test.bpe\"\n", " level: \"bpe\"\n", " lowercase: False\n", " max_sent_length: 100\n", " src_vocab: \"data/{name}/vocab.txt\"\n", " trg_vocab: \"data/{name}/vocab.txt\"\n", "\n", "testing:\n", " beam_size: 5\n", " alpha: 1.0\n", "\n", "training:\n", " #load_model: \"{gdrive_path}/models/{name}_transformer/1.ckpt\" # if uncommented, load a pre-trained model from this checkpoint\n", " random_seed: 42\n", " optimizer: \"adam\"\n", " normalization: \"tokens\"\n", " adam_betas: [0.9, 0.999]\n", " scheduling: \"plateau\" # TODO: try switching from plateau to Noam scheduling\n", 
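" #scheduling: \"noam\" # (added sketch for the TODO above) Noam warm-up/decay, which is\n", " # driven by learning_rate_factor and learning_rate_warmup below\n", 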
\n", " shuffle: True\n", " use_cuda: True\n", " max_output_length: 100\n", " print_valid_sents: [0, 1, 2, 3]\n", " keep_last_ckpts: 3\n", "\n", "model:\n", " initializer: \"xavier\"\n", " bias_initializer: \"zeros\"\n", " init_gain: 1.0\n", " embed_initializer: \"xavier\"\n", " embed_init_gain: 1.0\n", " tied_embeddings: True\n", " tied_softmax: True\n", " encoder:\n", " type: \"transformer\"\n", " num_layers: 6\n", " num_heads: 4 # TODO: Increase to 8 for larger data.\n", " embeddings:\n", " embedding_dim: 256 # TODO: Increase to 512 for larger data.\n", " scale: True\n", " dropout: 0.2\n", " # typically ff_size = 4 x hidden_size\n", " hidden_size: 256 # TODO: Increase to 512 for larger data.\n", " ff_size: 1024 # TODO: Increase to 2048 for larger data.\n", " dropout: 0.3\n", " decoder:\n", " type: \"transformer\"\n", " num_layers: 6\n", " num_heads: 4 # TODO: Increase to 8 for larger data.\n", " embeddings:\n", " embedding_dim: 256 # TODO: Increase to 512 for larger data.\n", " scale: True\n", " dropout: 0.2\n", " # typically ff_size = 4 x hidden_size\n", " hidden_size: 256 # TODO: Increase to 512 for larger data.\n", " ff_size: 1024 # TODO: Increase to 2048 for larger data.\n", " dropout: 0.3\n", "\"\"\".format(name=name, gdrive_path=os.environ[\"gdrive_path\"], source_language=source_language, target_language=target_language)\n", "with open(\"joeynmt/configs/transformer_{name}.yaml\".format(name=name),'w') as f:\n", " f.write(config)" ], "execution_count": 0, "outputs": [] }, { "cell_type": "markdown", "metadata": { "colab_type": "text", "id": "pIifxE3Qzuvs" }, "source": [ "# Train the Model\n", "\n", "This single line of joeynmt runs the training using the config we made above" ] }, { "cell_type": "code", "metadata": { "colab_type": "code", "id": "6ZBPFwT94WpI", "outputId": "17b8ca90-88be-4855-998b-725896a08a56", "colab": { "base_uri": "https://localhost:8080/", "height": 1000 } }, "source": [ "# Train the model\n", "# You can press Ctrl-C to stop. And then run the next cell to save your checkpoints! \n", "!cd joeynmt; python3 -m joeynmt train configs/transformer_$src$tgt.yaml" ], "execution_count": 0, "outputs": [ { "output_type": "stream", "text": [ "2020-01-12 13:08:59,113 Hello! 
This is Joey-NMT.\n", "2020-01-12 13:09:00,527 Total params: 12201472\n", "2020-01-12 13:09:00,528 Trainable parameters: ['decoder.layer_norm.bias', 'decoder.layer_norm.weight', 'decoder.layers.0.dec_layer_norm.bias', 'decoder.layers.0.dec_layer_norm.weight', 'decoder.layers.0.feed_forward.layer_norm.bias', 'decoder.layers.0.feed_forward.layer_norm.weight', 'decoder.layers.0.feed_forward.pwff_layer.0.bias', 'decoder.layers.0.feed_forward.pwff_layer.0.weight', 'decoder.layers.0.feed_forward.pwff_layer.3.bias', 'decoder.layers.0.feed_forward.pwff_layer.3.weight', 'decoder.layers.0.src_trg_att.k_layer.bias', 'decoder.layers.0.src_trg_att.k_layer.weight', 'decoder.layers.0.src_trg_att.output_layer.bias', 'decoder.layers.0.src_trg_att.output_layer.weight', 'decoder.layers.0.src_trg_att.q_layer.bias', 'decoder.layers.0.src_trg_att.q_layer.weight', 'decoder.layers.0.src_trg_att.v_layer.bias', 'decoder.layers.0.src_trg_att.v_layer.weight', 'decoder.layers.0.trg_trg_att.k_layer.bias', 'decoder.layers.0.trg_trg_att.k_layer.weight', 'decoder.layers.0.trg_trg_att.output_layer.bias', 'decoder.layers.0.trg_trg_att.output_layer.weight', 'decoder.layers.0.trg_trg_att.q_layer.bias', 'decoder.layers.0.trg_trg_att.q_layer.weight', 'decoder.layers.0.trg_trg_att.v_layer.bias', 'decoder.layers.0.trg_trg_att.v_layer.weight', 'decoder.layers.0.x_layer_norm.bias', 'decoder.layers.0.x_layer_norm.weight', 'decoder.layers.1.dec_layer_norm.bias', 'decoder.layers.1.dec_layer_norm.weight', 'decoder.layers.1.feed_forward.layer_norm.bias', 'decoder.layers.1.feed_forward.layer_norm.weight', 'decoder.layers.1.feed_forward.pwff_layer.0.bias', 'decoder.layers.1.feed_forward.pwff_layer.0.weight', 'decoder.layers.1.feed_forward.pwff_layer.3.bias', 'decoder.layers.1.feed_forward.pwff_layer.3.weight', 'decoder.layers.1.src_trg_att.k_layer.bias', 'decoder.layers.1.src_trg_att.k_layer.weight', 'decoder.layers.1.src_trg_att.output_layer.bias', 'decoder.layers.1.src_trg_att.output_layer.weight', 'decoder.layers.1.src_trg_att.q_layer.bias', 'decoder.layers.1.src_trg_att.q_layer.weight', 'decoder.layers.1.src_trg_att.v_layer.bias', 'decoder.layers.1.src_trg_att.v_layer.weight', 'decoder.layers.1.trg_trg_att.k_layer.bias', 'decoder.layers.1.trg_trg_att.k_layer.weight', 'decoder.layers.1.trg_trg_att.output_layer.bias', 'decoder.layers.1.trg_trg_att.output_layer.weight', 'decoder.layers.1.trg_trg_att.q_layer.bias', 'decoder.layers.1.trg_trg_att.q_layer.weight', 'decoder.layers.1.trg_trg_att.v_layer.bias', 'decoder.layers.1.trg_trg_att.v_layer.weight', 'decoder.layers.1.x_layer_norm.bias', 'decoder.layers.1.x_layer_norm.weight', 'decoder.layers.2.dec_layer_norm.bias', 'decoder.layers.2.dec_layer_norm.weight', 'decoder.layers.2.feed_forward.layer_norm.bias', 'decoder.layers.2.feed_forward.layer_norm.weight', 'decoder.layers.2.feed_forward.pwff_layer.0.bias', 'decoder.layers.2.feed_forward.pwff_layer.0.weight', 'decoder.layers.2.feed_forward.pwff_layer.3.bias', 'decoder.layers.2.feed_forward.pwff_layer.3.weight', 'decoder.layers.2.src_trg_att.k_layer.bias', 'decoder.layers.2.src_trg_att.k_layer.weight', 'decoder.layers.2.src_trg_att.output_layer.bias', 'decoder.layers.2.src_trg_att.output_layer.weight', 'decoder.layers.2.src_trg_att.q_layer.bias', 'decoder.layers.2.src_trg_att.q_layer.weight', 'decoder.layers.2.src_trg_att.v_layer.bias', 'decoder.layers.2.src_trg_att.v_layer.weight', 'decoder.layers.2.trg_trg_att.k_layer.bias', 'decoder.layers.2.trg_trg_att.k_layer.weight', 'decoder.layers.2.trg_trg_att.output_layer.bias', 
'decoder.layers.2.trg_trg_att.output_layer.weight', 'decoder.layers.2.trg_trg_att.q_layer.bias', 'decoder.layers.2.trg_trg_att.q_layer.weight', 'decoder.layers.2.trg_trg_att.v_layer.bias', 'decoder.layers.2.trg_trg_att.v_layer.weight', 'decoder.layers.2.x_layer_norm.bias', 'decoder.layers.2.x_layer_norm.weight', 'decoder.layers.3.dec_layer_norm.bias', 'decoder.layers.3.dec_layer_norm.weight', 'decoder.layers.3.feed_forward.layer_norm.bias', 'decoder.layers.3.feed_forward.layer_norm.weight', 'decoder.layers.3.feed_forward.pwff_layer.0.bias', 'decoder.layers.3.feed_forward.pwff_layer.0.weight', 'decoder.layers.3.feed_forward.pwff_layer.3.bias', 'decoder.layers.3.feed_forward.pwff_layer.3.weight', 'decoder.layers.3.src_trg_att.k_layer.bias', 'decoder.layers.3.src_trg_att.k_layer.weight', 'decoder.layers.3.src_trg_att.output_layer.bias', 'decoder.layers.3.src_trg_att.output_layer.weight', 'decoder.layers.3.src_trg_att.q_layer.bias', 'decoder.layers.3.src_trg_att.q_layer.weight', 'decoder.layers.3.src_trg_att.v_layer.bias', 'decoder.layers.3.src_trg_att.v_layer.weight', 'decoder.layers.3.trg_trg_att.k_layer.bias', 'decoder.layers.3.trg_trg_att.k_layer.weight', 'decoder.layers.3.trg_trg_att.output_layer.bias', 'decoder.layers.3.trg_trg_att.output_layer.weight', 'decoder.layers.3.trg_trg_att.q_layer.bias', 'decoder.layers.3.trg_trg_att.q_layer.weight', 'decoder.layers.3.trg_trg_att.v_layer.bias', 'decoder.layers.3.trg_trg_att.v_layer.weight', 'decoder.layers.3.x_layer_norm.bias', 'decoder.layers.3.x_layer_norm.weight', 'decoder.layers.4.dec_layer_norm.bias', 'decoder.layers.4.dec_layer_norm.weight', 'decoder.layers.4.feed_forward.layer_norm.bias', 'decoder.layers.4.feed_forward.layer_norm.weight', 'decoder.layers.4.feed_forward.pwff_layer.0.bias', 'decoder.layers.4.feed_forward.pwff_layer.0.weight', 'decoder.layers.4.feed_forward.pwff_layer.3.bias', 'decoder.layers.4.feed_forward.pwff_layer.3.weight', 'decoder.layers.4.src_trg_att.k_layer.bias', 'decoder.layers.4.src_trg_att.k_layer.weight', 'decoder.layers.4.src_trg_att.output_layer.bias', 'decoder.layers.4.src_trg_att.output_layer.weight', 'decoder.layers.4.src_trg_att.q_layer.bias', 'decoder.layers.4.src_trg_att.q_layer.weight', 'decoder.layers.4.src_trg_att.v_layer.bias', 'decoder.layers.4.src_trg_att.v_layer.weight', 'decoder.layers.4.trg_trg_att.k_layer.bias', 'decoder.layers.4.trg_trg_att.k_layer.weight', 'decoder.layers.4.trg_trg_att.output_layer.bias', 'decoder.layers.4.trg_trg_att.output_layer.weight', 'decoder.layers.4.trg_trg_att.q_layer.bias', 'decoder.layers.4.trg_trg_att.q_layer.weight', 'decoder.layers.4.trg_trg_att.v_layer.bias', 'decoder.layers.4.trg_trg_att.v_layer.weight', 'decoder.layers.4.x_layer_norm.bias', 'decoder.layers.4.x_layer_norm.weight', 'decoder.layers.5.dec_layer_norm.bias', 'decoder.layers.5.dec_layer_norm.weight', 'decoder.layers.5.feed_forward.layer_norm.bias', 'decoder.layers.5.feed_forward.layer_norm.weight', 'decoder.layers.5.feed_forward.pwff_layer.0.bias', 'decoder.layers.5.feed_forward.pwff_layer.0.weight', 'decoder.layers.5.feed_forward.pwff_layer.3.bias', 'decoder.layers.5.feed_forward.pwff_layer.3.weight', 'decoder.layers.5.src_trg_att.k_layer.bias', 'decoder.layers.5.src_trg_att.k_layer.weight', 'decoder.layers.5.src_trg_att.output_layer.bias', 'decoder.layers.5.src_trg_att.output_layer.weight', 'decoder.layers.5.src_trg_att.q_layer.bias', 'decoder.layers.5.src_trg_att.q_layer.weight', 'decoder.layers.5.src_trg_att.v_layer.bias', 'decoder.layers.5.src_trg_att.v_layer.weight', 
'decoder.layers.5.trg_trg_att.k_layer.bias', 'decoder.layers.5.trg_trg_att.k_layer.weight', 'decoder.layers.5.trg_trg_att.output_layer.bias', 'decoder.layers.5.trg_trg_att.output_layer.weight', 'decoder.layers.5.trg_trg_att.q_layer.bias', 'decoder.layers.5.trg_trg_att.q_layer.weight', 'decoder.layers.5.trg_trg_att.v_layer.bias', 'decoder.layers.5.trg_trg_att.v_layer.weight', 'decoder.layers.5.x_layer_norm.bias', 'decoder.layers.5.x_layer_norm.weight', 'encoder.layer_norm.bias', 'encoder.layer_norm.weight', 'encoder.layers.0.feed_forward.layer_norm.bias', 'encoder.layers.0.feed_forward.layer_norm.weight', 'encoder.layers.0.feed_forward.pwff_layer.0.bias', 'encoder.layers.0.feed_forward.pwff_layer.0.weight', 'encoder.layers.0.feed_forward.pwff_layer.3.bias', 'encoder.layers.0.feed_forward.pwff_layer.3.weight', 'encoder.layers.0.layer_norm.bias', 'encoder.layers.0.layer_norm.weight', 'encoder.layers.0.src_src_att.k_layer.bias', 'encoder.layers.0.src_src_att.k_layer.weight', 'encoder.layers.0.src_src_att.output_layer.bias', 'encoder.layers.0.src_src_att.output_layer.weight', 'encoder.layers.0.src_src_att.q_layer.bias', 'encoder.layers.0.src_src_att.q_layer.weight', 'encoder.layers.0.src_src_att.v_layer.bias', 'encoder.layers.0.src_src_att.v_layer.weight', 'encoder.layers.1.feed_forward.layer_norm.bias', 'encoder.layers.1.feed_forward.layer_norm.weight', 'encoder.layers.1.feed_forward.pwff_layer.0.bias', 'encoder.layers.1.feed_forward.pwff_layer.0.weight', 'encoder.layers.1.feed_forward.pwff_layer.3.bias', 'encoder.layers.1.feed_forward.pwff_layer.3.weight', 'encoder.layers.1.layer_norm.bias', 'encoder.layers.1.layer_norm.weight', 'encoder.layers.1.src_src_att.k_layer.bias', 'encoder.layers.1.src_src_att.k_layer.weight', 'encoder.layers.1.src_src_att.output_layer.bias', 'encoder.layers.1.src_src_att.output_layer.weight', 'encoder.layers.1.src_src_att.q_layer.bias', 'encoder.layers.1.src_src_att.q_layer.weight', 'encoder.layers.1.src_src_att.v_layer.bias', 'encoder.layers.1.src_src_att.v_layer.weight', 'encoder.layers.2.feed_forward.layer_norm.bias', 'encoder.layers.2.feed_forward.layer_norm.weight', 'encoder.layers.2.feed_forward.pwff_layer.0.bias', 'encoder.layers.2.feed_forward.pwff_layer.0.weight', 'encoder.layers.2.feed_forward.pwff_layer.3.bias', 'encoder.layers.2.feed_forward.pwff_layer.3.weight', 'encoder.layers.2.layer_norm.bias', 'encoder.layers.2.layer_norm.weight', 'encoder.layers.2.src_src_att.k_layer.bias', 'encoder.layers.2.src_src_att.k_layer.weight', 'encoder.layers.2.src_src_att.output_layer.bias', 'encoder.layers.2.src_src_att.output_layer.weight', 'encoder.layers.2.src_src_att.q_layer.bias', 'encoder.layers.2.src_src_att.q_layer.weight', 'encoder.layers.2.src_src_att.v_layer.bias', 'encoder.layers.2.src_src_att.v_layer.weight', 'encoder.layers.3.feed_forward.layer_norm.bias', 'encoder.layers.3.feed_forward.layer_norm.weight', 'encoder.layers.3.feed_forward.pwff_layer.0.bias', 'encoder.layers.3.feed_forward.pwff_layer.0.weight', 'encoder.layers.3.feed_forward.pwff_layer.3.bias', 'encoder.layers.3.feed_forward.pwff_layer.3.weight', 'encoder.layers.3.layer_norm.bias', 'encoder.layers.3.layer_norm.weight', 'encoder.layers.3.src_src_att.k_layer.bias', 'encoder.layers.3.src_src_att.k_layer.weight', 'encoder.layers.3.src_src_att.output_layer.bias', 'encoder.layers.3.src_src_att.output_layer.weight', 'encoder.layers.3.src_src_att.q_layer.bias', 'encoder.layers.3.src_src_att.q_layer.weight', 'encoder.layers.3.src_src_att.v_layer.bias', 'encoder.layers.3.src_src_att.v_layer.weight', 
'encoder.layers.4.feed_forward.layer_norm.bias', 'encoder.layers.4.feed_forward.layer_norm.weight', 'encoder.layers.4.feed_forward.pwff_layer.0.bias', 'encoder.layers.4.feed_forward.pwff_layer.0.weight', 'encoder.layers.4.feed_forward.pwff_layer.3.bias', 'encoder.layers.4.feed_forward.pwff_layer.3.weight', 'encoder.layers.4.layer_norm.bias', 'encoder.layers.4.layer_norm.weight', 'encoder.layers.4.src_src_att.k_layer.bias', 'encoder.layers.4.src_src_att.k_layer.weight', 'encoder.layers.4.src_src_att.output_layer.bias', 'encoder.layers.4.src_src_att.output_layer.weight', 'encoder.layers.4.src_src_att.q_layer.bias', 'encoder.layers.4.src_src_att.q_layer.weight', 'encoder.layers.4.src_src_att.v_layer.bias', 'encoder.layers.4.src_src_att.v_layer.weight', 'encoder.layers.5.feed_forward.layer_norm.bias', 'encoder.layers.5.feed_forward.layer_norm.weight', 'encoder.layers.5.feed_forward.pwff_layer.0.bias', 'encoder.layers.5.feed_forward.pwff_layer.0.weight', 'encoder.layers.5.feed_forward.pwff_layer.3.bias', 'encoder.layers.5.feed_forward.pwff_layer.3.weight', 'encoder.layers.5.layer_norm.bias', 'encoder.layers.5.layer_norm.weight', 'encoder.layers.5.src_src_att.k_layer.bias', 'encoder.layers.5.src_src_att.k_layer.weight', 'encoder.layers.5.src_src_att.output_layer.bias', 'encoder.layers.5.src_src_att.output_layer.weight', 'encoder.layers.5.src_src_att.q_layer.bias', 'encoder.layers.5.src_src_att.q_layer.weight', 'encoder.layers.5.src_src_att.v_layer.bias', 'encoder.layers.5.src_src_att.v_layer.weight', 'src_embed.lut.weight']\n", "2020-01-12 13:09:09,672 cfg.name : ents_transformer\n", "2020-01-12 13:09:09,673 cfg.data.src : en\n", "2020-01-12 13:09:09,673 cfg.data.trg : ts\n", "2020-01-12 13:09:09,673 cfg.data.train : data/ents/train.bpe\n", "2020-01-12 13:09:09,673 cfg.data.dev : data/ents/dev.bpe\n", "2020-01-12 13:09:09,673 cfg.data.test : data/ents/test.bpe\n", "2020-01-12 13:09:09,673 cfg.data.level : bpe\n", "2020-01-12 13:09:09,673 cfg.data.lowercase : False\n", "2020-01-12 13:09:09,673 cfg.data.max_sent_length : 100\n", "2020-01-12 13:09:09,673 cfg.data.src_vocab : data/ents/vocab.txt\n", "2020-01-12 13:09:09,673 cfg.data.trg_vocab : data/ents/vocab.txt\n", "2020-01-12 13:09:09,673 cfg.testing.beam_size : 5\n", "2020-01-12 13:09:09,673 cfg.testing.alpha : 1.0\n", "2020-01-12 13:09:09,673 cfg.training.random_seed : 42\n", "2020-01-12 13:09:09,674 cfg.training.optimizer : adam\n", "2020-01-12 13:09:09,674 cfg.training.normalization : tokens\n", "2020-01-12 13:09:09,674 cfg.training.adam_betas : [0.9, 0.999]\n", "2020-01-12 13:09:09,674 cfg.training.scheduling : plateau\n", "2020-01-12 13:09:09,674 cfg.training.patience : 5\n", "2020-01-12 13:09:09,674 cfg.training.learning_rate_factor : 0.5\n", "2020-01-12 13:09:09,674 cfg.training.learning_rate_warmup : 1000\n", "2020-01-12 13:09:09,674 cfg.training.decrease_factor : 0.7\n", "2020-01-12 13:09:09,674 cfg.training.loss : crossentropy\n", "2020-01-12 13:09:09,674 cfg.training.learning_rate : 0.0003\n", "2020-01-12 13:09:09,674 cfg.training.learning_rate_min : 1e-08\n", "2020-01-12 13:09:09,674 cfg.training.weight_decay : 0.0\n", "2020-01-12 13:09:09,675 cfg.training.label_smoothing : 0.1\n", "2020-01-12 13:09:09,675 cfg.training.batch_size : 4096\n", "2020-01-12 13:09:09,675 cfg.training.batch_type : token\n", "2020-01-12 13:09:09,675 cfg.training.eval_batch_size : 3600\n", "2020-01-12 13:09:09,675 cfg.training.eval_batch_type : token\n", "2020-01-12 13:09:09,675 cfg.training.batch_multiplier : 1\n", "2020-01-12 13:09:09,675 
cfg.training.early_stopping_metric : ppl\n", "2020-01-12 13:09:09,675 cfg.training.epochs : 30\n", "2020-01-12 13:09:09,675 cfg.training.validation_freq : 1000\n", "2020-01-12 13:09:09,675 cfg.training.logging_freq : 100\n", "2020-01-12 13:09:09,675 cfg.training.eval_metric : bleu\n", "2020-01-12 13:09:09,675 cfg.training.model_dir : models/ents_transformer\n", "2020-01-12 13:09:09,675 cfg.training.overwrite : False\n", "2020-01-12 13:09:09,675 cfg.training.shuffle : True\n", "2020-01-12 13:09:09,675 cfg.training.use_cuda : True\n", "2020-01-12 13:09:09,675 cfg.training.max_output_length : 100\n", "2020-01-12 13:09:09,676 cfg.training.print_valid_sents : [0, 1, 2, 3]\n", "2020-01-12 13:09:09,676 cfg.training.keep_last_ckpts : 3\n", "2020-01-12 13:09:09,676 cfg.model.initializer : xavier\n", "2020-01-12 13:09:09,676 cfg.model.bias_initializer : zeros\n", "2020-01-12 13:09:09,676 cfg.model.init_gain : 1.0\n", "2020-01-12 13:09:09,676 cfg.model.embed_initializer : xavier\n", "2020-01-12 13:09:09,676 cfg.model.embed_init_gain : 1.0\n", "2020-01-12 13:09:09,676 cfg.model.tied_embeddings : True\n", "2020-01-12 13:09:09,676 cfg.model.tied_softmax : True\n", "2020-01-12 13:09:09,676 cfg.model.encoder.type : transformer\n", "2020-01-12 13:09:09,676 cfg.model.encoder.num_layers : 6\n", "2020-01-12 13:09:09,676 cfg.model.encoder.num_heads : 4\n", "2020-01-12 13:09:09,676 cfg.model.encoder.embeddings.embedding_dim : 256\n", "2020-01-12 13:09:09,676 cfg.model.encoder.embeddings.scale : True\n", "2020-01-12 13:09:09,676 cfg.model.encoder.embeddings.dropout : 0.2\n", "2020-01-12 13:09:09,676 cfg.model.encoder.hidden_size : 256\n", "2020-01-12 13:09:09,677 cfg.model.encoder.ff_size : 1024\n", "2020-01-12 13:09:09,677 cfg.model.encoder.dropout : 0.3\n", "2020-01-12 13:09:09,677 cfg.model.decoder.type : transformer\n", "2020-01-12 13:09:09,677 cfg.model.decoder.num_layers : 6\n", "2020-01-12 13:09:09,677 cfg.model.decoder.num_heads : 4\n", "2020-01-12 13:09:09,677 cfg.model.decoder.embeddings.embedding_dim : 256\n", "2020-01-12 13:09:09,677 cfg.model.decoder.embeddings.scale : True\n", "2020-01-12 13:09:09,677 cfg.model.decoder.embeddings.dropout : 0.2\n", "2020-01-12 13:09:09,677 cfg.model.decoder.hidden_size : 256\n", "2020-01-12 13:09:09,677 cfg.model.decoder.ff_size : 1024\n", "2020-01-12 13:09:09,677 cfg.model.decoder.dropout : 0.3\n", "2020-01-12 13:09:09,677 Data set sizes: \n", "\ttrain 765609,\n", "\tvalid 1000,\n", "\ttest 2711\n", "2020-01-12 13:09:09,678 First training example:\n", "\t[SRC] However , the real cause of their dis@@ tre@@ ss went far de@@ ep@@ er .\n", "\t[TRG] Hambi swi ri tano , xivangelo xa ntiyiso xa gome ra vona xi ye xi ny@@ anya .\n", "2020-01-12 13:09:09,678 First 10 words (src): (0) (1) (2) (3) (4) . (5) , (6) a (7) ku (8) the (9) hi\n", "2020-01-12 13:09:09,678 First 10 words (trg): (0) (1) (2) (3) (4) . 
(5) , (6) a (7) ku (8) the (9) hi\n", "2020-01-12 13:09:09,678 Number of Src words (types): 4458\n", "2020-01-12 13:09:09,679 Number of Trg words (types): 4458\n", "2020-01-12 13:09:09,679 Model(\n", "\tencoder=TransformerEncoder(num_layers=6, num_heads=4),\n", "\tdecoder=TransformerDecoder(num_layers=6, num_heads=4),\n", "\tsrc_embed=Embeddings(embedding_dim=256, vocab_size=4458),\n", "\ttrg_embed=Embeddings(embedding_dim=256, vocab_size=4458))\n", "2020-01-12 13:09:09,683 EPOCH 1\n", "2020-01-12 13:09:25,467 Epoch 1 Step: 100 Batch Loss: 5.637402 Tokens per Sec: 15841, Lr: 0.000300\n", "2020-01-12 13:09:39,812 Epoch 1 Step: 200 Batch Loss: 5.210485 Tokens per Sec: 17659, Lr: 0.000300\n", "2020-01-12 13:09:54,220 Epoch 1 Step: 300 Batch Loss: 5.204889 Tokens per Sec: 17574, Lr: 0.000300\n", "2020-01-12 13:10:08,621 Epoch 1 Step: 400 Batch Loss: 5.102695 Tokens per Sec: 17477, Lr: 0.000300\n", "2020-01-12 13:10:23,038 Epoch 1 Step: 500 Batch Loss: 4.440761 Tokens per Sec: 17507, Lr: 0.000300\n", "2020-01-12 13:10:37,321 Epoch 1 Step: 600 Batch Loss: 4.427644 Tokens per Sec: 17317, Lr: 0.000300\n", "2020-01-12 13:10:51,659 Epoch 1 Step: 700 Batch Loss: 4.534369 Tokens per Sec: 17302, Lr: 0.000300\n", "2020-01-12 13:11:05,953 Epoch 1 Step: 800 Batch Loss: 4.246399 Tokens per Sec: 17339, Lr: 0.000300\n", "2020-01-12 13:11:20,203 Epoch 1 Step: 900 Batch Loss: 3.882329 Tokens per Sec: 17116, Lr: 0.000300\n", "2020-01-12 13:11:34,475 Epoch 1 Step: 1000 Batch Loss: 4.228528 Tokens per Sec: 17219, Lr: 0.000300\n", "2020-01-12 13:12:11,961 Hooray! New best validation result [ppl]!\n", "2020-01-12 13:12:11,962 Saving new checkpoint.\n", "2020-01-12 13:12:12,340 Example #0\n", "2020-01-12 13:12:12,340 \tSource: But , naturally , we also want relief .\n", "2020-01-12 13:12:12,341 \tReference: Kambe hi lava leswaku ku xaniseka ku herisiwa .\n", "2020-01-12 13:12:12,341 \tHypothesis: Loko a nga ha ri ni ku ri ni ku nga ha yona , a va ri ni ku nga ha ri ni ku va ni ku va ni ku va ni ku va ni ku va va va va va va va va va va va va va va va va va va va va va va va va va va va va va va va va va va va va va va va va va va va va va va va va va va va va va va va va va va va va va va va va\n", "2020-01-12 13:12:12,341 Example #1\n", "2020-01-12 13:12:12,341 \tSource: 3 “ We’re Letting You Go ”\n", "2020-01-12 13:12:12,341 \tReference: 3 “ Ntirho Wa Wena Wu Herile ”\n", "2020-01-12 13:12:12,341 \tHypothesis: “ “ Ku Kingela Nyelo Wa Nyelo Wa Nyelo Wa Nyelo Wa Nyelo Wa Nyelo Wa Nyelo Wa Nyelo\n", "2020-01-12 13:12:12,341 Example #2\n", "2020-01-12 13:12:12,341 \tSource: Paralleling our actions against terrorism , we must have equally vigorous efforts to enhance freedom , dignity , justice , and humanitarian values .\n", "2020-01-12 13:12:12,341 \tReference: Loko hi ri karhi hi lwisana ni vutherorisi , hi fanele hi endla matshalatshala lamakulu yo lwela ntshunxeko , xindzhuti , vululami ni ku antswisa timfanelo ta hina .\n", "2020-01-12 13:12:12,341 \tHypothesis: Hi ku nga ha ri ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku\n", "2020-01-12 13:12:12,342 Example #3\n", "2020-01-12 13:12:12,342 \tSource: Baptism symbolizes dedication to God and marks one’s ordination as a minister\n", "2020-01-12 13:12:12,342 \tReference: Ku khuvuriwa ku kombisa ku tinyiketela eka Xikwembu naswona ku 
fungha ku vekiwa ka munhu tanihi mutirheli\n", "2020-01-12 13:12:12,342 \tHypothesis: Ku nga ha ri ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku nga ha yona .\n", "2020-01-12 13:12:12,342 Validation result (greedy) at epoch 1, step 1000: bleu: 0.29, loss: 115859.9531, ppl: 50.5687, duration: 37.8667s\n", "2020-01-12 13:12:26,769 Epoch 1 Step: 1100 Batch Loss: 3.933003 Tokens per Sec: 17339, Lr: 0.000300\n", "2020-01-12 13:12:41,087 Epoch 1 Step: 1200 Batch Loss: 3.740966 Tokens per Sec: 17711, Lr: 0.000300\n", "2020-01-12 13:12:55,466 Epoch 1 Step: 1300 Batch Loss: 3.640759 Tokens per Sec: 17213, Lr: 0.000300\n", "2020-01-12 13:13:09,790 Epoch 1 Step: 1400 Batch Loss: 3.606798 Tokens per Sec: 17496, Lr: 0.000300\n", "2020-01-12 13:13:24,173 Epoch 1 Step: 1500 Batch Loss: 3.369590 Tokens per Sec: 17368, Lr: 0.000300\n", "2020-01-12 13:13:38,530 Epoch 1 Step: 1600 Batch Loss: 3.302809 Tokens per Sec: 17531, Lr: 0.000300\n", "2020-01-12 13:13:52,870 Epoch 1 Step: 1700 Batch Loss: 3.619540 Tokens per Sec: 17572, Lr: 0.000300\n", "2020-01-12 13:14:07,287 Epoch 1 Step: 1800 Batch Loss: 3.081284 Tokens per Sec: 17665, Lr: 0.000300\n", "2020-01-12 13:14:21,659 Epoch 1 Step: 1900 Batch Loss: 3.422733 Tokens per Sec: 17245, Lr: 0.000300\n", "2020-01-12 13:14:36,063 Epoch 1 Step: 2000 Batch Loss: 3.106571 Tokens per Sec: 17750, Lr: 0.000300\n", "2020-01-12 13:15:13,265 Hooray! New best validation result [ppl]!\n", "2020-01-12 13:15:13,265 Saving new checkpoint.\n", "2020-01-12 13:15:13,588 Example #0\n", "2020-01-12 13:15:13,588 \tSource: But , naturally , we also want relief .\n", "2020-01-12 13:15:13,588 \tReference: Kambe hi lava leswaku ku xaniseka ku herisiwa .\n", "2020-01-12 13:15:13,588 \tHypothesis: Kambe , hi nga ha va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni .\n", "2020-01-12 13:15:13,589 Example #1\n", "2020-01-12 13:15:13,589 \tSource: 3 “ We’re Letting You Go ”\n", "2020-01-12 13:15:13,589 \tReference: 3 “ Ntirho Wa Wena Wu Herile ”\n", "2020-01-12 13:15:13,589 \tHypothesis: 1 : 1\n", "2020-01-12 13:15:13,589 Example #2\n", "2020-01-12 13:15:13,589 \tSource: Paralleling our actions against terrorism , we must have equally vigorous efforts to enhance freedom , dignity , justice , and humanitarian values .\n", "2020-01-12 13:15:13,589 \tReference: Loko hi ri karhi hi lwisana ni vutherorisi , hi fanele hi endla matshalatshala lamakulu yo lwela ntshunxeko , xindzhuti , vululami ni ku antswisa timfanelo ta hina .\n", "2020-01-12 13:15:13,590 \tHypothesis: Ku nga khathariseki leswaku hi endla leswaku hi endla leswaku hi endla leswaku hi endla leswaku hi endla leswaku hi endla leswaku hi endla leswaku hi endla swilo leswi hi nga ni ku endla swilo swo biha , hi nga ni ku endla swilo leswi nga ni ku endla swilo swa nkoka .\n", "2020-01-12 13:15:13,590 Example #3\n", "2020-01-12 13:15:13,590 \tSource: Baptism symbolizes dedication to God and marks one’s ordination as a minister\n", "2020-01-12 13:15:13,590 \tReference: Ku khuvuriwa ku kombisa ku tinyiketela eka Xikwembu naswona ku fungha ku vekiwa ka munhu tanihi mutirheli\n", "2020-01-12 13:15:13,590 \tHypothesis: Ku fana ni ku tirhisa Xikwembu ni ku endla leswaku xi endla leswaku xi va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni 
ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni\n", "2020-01-12 13:15:13,590 Validation result (greedy) at epoch 1, step 2000: bleu: 1.25, loss: 93454.9219, ppl: 23.6802, duration: 37.5274s\n", "2020-01-12 13:15:28,031 Epoch 1 Step: 2100 Batch Loss: 3.358097 Tokens per Sec: 17667, Lr: 0.000300\n", "2020-01-12 13:15:42,362 Epoch 1 Step: 2200 Batch Loss: 2.923495 Tokens per Sec: 17225, Lr: 0.000300\n", "2020-01-12 13:15:56,772 Epoch 1 Step: 2300 Batch Loss: 3.491554 Tokens per Sec: 17395, Lr: 0.000300\n", "2020-01-12 13:16:11,225 Epoch 1 Step: 2400 Batch Loss: 3.314225 Tokens per Sec: 17373, Lr: 0.000300\n", "2020-01-12 13:16:25,585 Epoch 1 Step: 2500 Batch Loss: 3.091627 Tokens per Sec: 17376, Lr: 0.000300\n", "2020-01-12 13:16:39,972 Epoch 1 Step: 2600 Batch Loss: 2.980924 Tokens per Sec: 17645, Lr: 0.000300\n", "2020-01-12 13:16:54,356 Epoch 1 Step: 2700 Batch Loss: 3.150299 Tokens per Sec: 17390, Lr: 0.000300\n", "2020-01-12 13:17:08,658 Epoch 1 Step: 2800 Batch Loss: 2.667620 Tokens per Sec: 17310, Lr: 0.000300\n", "2020-01-12 13:17:22,955 Epoch 1 Step: 2900 Batch Loss: 3.085295 Tokens per Sec: 17406, Lr: 0.000300\n", "2020-01-12 13:17:37,313 Epoch 1 Step: 3000 Batch Loss: 2.878503 Tokens per Sec: 17509, Lr: 0.000300\n", "2020-01-12 13:18:14,525 Hooray! New best validation result [ppl]!\n", "2020-01-12 13:18:14,525 Saving new checkpoint.\n", "2020-01-12 13:18:14,866 Example #0\n", "2020-01-12 13:18:14,867 \tSource: But , naturally , we also want relief .\n", "2020-01-12 13:18:14,867 \tReference: Kambe hi lava leswaku ku xaniseka ku herisiwa .\n", "2020-01-12 13:18:14,867 \tHypothesis: Kambe , hi fanele hi va ni ntshembo lowu .\n", "2020-01-12 13:18:14,867 Example #1\n", "2020-01-12 13:18:14,867 \tSource: 3 “ We’re Letting You Go ”\n", "2020-01-12 13:18:14,867 \tReference: 3 “ Ntirho Wa Wena Wu Herile ”\n", "2020-01-12 13:18:14,867 \tHypothesis: 3 “ Ku Ti Ku Ti Ku Ti Ku Ti Ku Ti Njhani ”\n", "2020-01-12 13:18:14,867 Example #2\n", "2020-01-12 13:18:14,867 \tSource: Paralleling our actions against terrorism , we must have equally vigorous efforts to enhance freedom , dignity , justice , and humanitarian values .\n", "2020-01-12 13:18:14,868 \tReference: Loko hi ri karhi hi lwisana ni vutherorisi , hi fanele hi endla matshalatshala lamakulu yo lwela ntshunxeko , xindzhuti , vululami ni ku antswisa timfanelo ta hina .\n", "2020-01-12 13:18:14,868 \tHypothesis: Ku nga khathariseki leswaku hi nga ha va ni vutshunguri bya hina , hi fanele hi va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni ntshembo wa ku rhula .\n", "2020-01-12 13:18:14,868 Example #3\n", "2020-01-12 13:18:14,868 \tSource: Baptism symbolizes dedication to God and marks one’s ordination as a minister\n", "2020-01-12 13:18:14,868 \tReference: Ku khuvuriwa ku kombisa ku tinyiketela eka Xikwembu naswona ku fungha ku vekiwa ka munhu tanihi mutirheli\n", "2020-01-12 13:18:14,868 \tHypothesis: Ku nga khathariseki leswaku ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni vutshembeki\n", "2020-01-12 13:18:14,868 Validation result (greedy) at epoch 1, step 3000: bleu: 3.86, loss: 82517.1562, ppl: 16.3505, duration: 37.5549s\n", "2020-01-12 13:18:29,231 Epoch 1 Step: 3100 Batch Loss: 2.989350 Tokens per Sec: 17415, Lr: 0.000300\n", "2020-01-12 13:18:43,555 Epoch 1 Step: 3200 Batch Loss: 2.734597 Tokens per Sec: 17162, Lr: 0.000300\n", "2020-01-12 13:18:57,931 Epoch 1 Step: 3300 Batch Loss: 3.273831 Tokens per Sec: 
17695, Lr: 0.000300\n", "2020-01-12 13:19:12,072 Epoch 1 Step: 3400 Batch Loss: 2.758227 Tokens per Sec: 17213, Lr: 0.000300\n", "2020-01-12 13:19:26,405 Epoch 1 Step: 3500 Batch Loss: 2.862200 Tokens per Sec: 17327, Lr: 0.000300\n", "2020-01-12 13:19:40,653 Epoch 1 Step: 3600 Batch Loss: 2.980892 Tokens per Sec: 17295, Lr: 0.000300\n", "2020-01-12 13:19:55,156 Epoch 1 Step: 3700 Batch Loss: 2.460630 Tokens per Sec: 17628, Lr: 0.000300\n", "2020-01-12 13:20:09,615 Epoch 1 Step: 3800 Batch Loss: 2.778854 Tokens per Sec: 17698, Lr: 0.000300\n", "2020-01-12 13:20:24,043 Epoch 1 Step: 3900 Batch Loss: 2.893207 Tokens per Sec: 17357, Lr: 0.000300\n", "2020-01-12 13:20:38,319 Epoch 1 Step: 4000 Batch Loss: 2.660086 Tokens per Sec: 17510, Lr: 0.000300\n", "2020-01-12 13:21:15,615 Hooray! New best validation result [ppl]!\n", "2020-01-12 13:21:15,615 Saving new checkpoint.\n", "2020-01-12 13:21:16,027 Example #0\n", "2020-01-12 13:21:16,027 \tSource: But , naturally , we also want relief .\n", "2020-01-12 13:21:16,027 \tReference: Kambe hi lava leswaku ku xaniseka ku herisiwa .\n", "2020-01-12 13:21:16,027 \tHypothesis: Kambe , hi fanele hi va ni ku va ni ku va ni ku va ni ku va ni ku va ni ku va ni vutshila .\n", "2020-01-12 13:21:16,027 Example #1\n", "2020-01-12 13:21:16,028 \tSource: 3 “ We’re Letting You Go ”\n", "2020-01-12 13:21:16,028 \tReference: 3 “ Ntirho Wa Wena Wu Herile ”\n", "2020-01-12 13:21:16,028 \tHypothesis: 3 “ Ku Tlwela Ku Tshembeka ”\n", "2020-01-12 13:21:16,028 Example #2\n", "2020-01-12 13:21:16,028 \tSource: Paralleling our actions against terrorism , we must have equally vigorous efforts to enhance freedom , dignity , justice , and humanitarian values .\n", "2020-01-12 13:21:16,028 \tReference: Loko hi ri karhi hi lwisana ni vutherorisi , hi fanele hi endla matshalatshala lamakulu yo lwela ntshunxeko , xindzhuti , vululami ni ku antswisa timfanelo ta hina .\n", "2020-01-12 13:21:16,028 \tHypothesis: Ku va ni swivangelo swa hina swa nkoka , hi fanele hi va ni vuxaka bya hina , hi fanele hi va ni vuxaka bya ku va ni vuxaka bya munhu , ku nga khathariseki leswaku vanhu va ni vululami .\n", "2020-01-12 13:21:16,028 Example #3\n", "2020-01-12 13:21:16,028 \tSource: Baptism symbolizes dedication to God and marks one’s ordination as a minister\n", "2020-01-12 13:21:16,028 \tReference: Ku khuvuriwa ku kombisa ku tinyiketela eka Xikwembu naswona ku fungha ku vekiwa ka munhu tanihi mutirheli\n", "2020-01-12 13:21:16,029 \tHypothesis: Ku va ni vuhomboloki lebyi nga ni ku tinyiketela ka Xikwembu ni ku va ni ku va ni ku va ni xirhambo xa ku va mulanguteri wa mulanguteri wa mulanguteri\n", "2020-01-12 13:21:16,029 Validation result (greedy) at epoch 1, step 4000: bleu: 7.51, loss: 74959.9141, ppl: 12.6587, duration: 37.7091s\n", "2020-01-12 13:21:30,485 Epoch 1 Step: 4100 Batch Loss: 2.579408 Tokens per Sec: 17612, Lr: 0.000300\n", "2020-01-12 13:21:44,877 Epoch 1 Step: 4200 Batch Loss: 2.548224 Tokens per Sec: 17407, Lr: 0.000300\n", "2020-01-12 13:21:59,246 Epoch 1 Step: 4300 Batch Loss: 2.446236 Tokens per Sec: 17171, Lr: 0.000300\n", "2020-01-12 13:22:13,698 Epoch 1 Step: 4400 Batch Loss: 2.601321 Tokens per Sec: 17595, Lr: 0.000300\n", "2020-01-12 13:22:28,046 Epoch 1 Step: 4500 Batch Loss: 2.866036 Tokens per Sec: 17540, Lr: 0.000300\n", "2020-01-12 13:22:42,475 Epoch 1 Step: 4600 Batch Loss: 2.563618 Tokens per Sec: 17545, Lr: 0.000300\n", "2020-01-12 13:22:56,829 Epoch 1 Step: 4700 Batch Loss: 2.339271 Tokens per Sec: 17371, Lr: 0.000300\n", "2020-01-12 13:23:11,220 Epoch 1 
Step: 4800 Batch Loss: 2.990680 Tokens per Sec: 17596, Lr: 0.000300\n", "2020-01-12 13:23:25,661 Epoch 1 Step: 4900 Batch Loss: 2.694713 Tokens per Sec: 17590, Lr: 0.000300\n", "2020-01-12 13:23:40,045 Epoch 1 Step: 5000 Batch Loss: 2.639535 Tokens per Sec: 17564, Lr: 0.000300\n", "2020-01-12 13:24:17,438 Hooray! New best validation result [ppl]!\n", "2020-01-12 13:24:17,439 Saving new checkpoint.\n", "2020-01-12 13:24:17,781 Example #0\n", "2020-01-12 13:24:17,782 \tSource: But , naturally , we also want relief .\n", "2020-01-12 13:24:17,782 \tReference: Kambe hi lava leswaku ku xaniseka ku herisiwa .\n", "2020-01-12 13:24:17,782 \tHypothesis: Kambe , hi fanele hi va ni musa .\n", "2020-01-12 13:24:17,782 Example #1\n", "2020-01-12 13:24:17,782 \tSource: 3 “ We’re Letting You Go ”\n", "2020-01-12 13:24:17,782 \tReference: 3 “ Ntirho Wa Wena Wu Herile ”\n", "2020-01-12 13:24:17,782 \tHypothesis: 3 “ Ku Tlwela Ku Rhandza Ku Rhandza ”\n", "2020-01-12 13:24:17,782 Example #2\n", "2020-01-12 13:24:17,782 \tSource: Paralleling our actions against terrorism , we must have equally vigorous efforts to enhance freedom , dignity , justice , and humanitarian values .\n", "2020-01-12 13:24:17,782 \tReference: Loko hi ri karhi hi lwisana ni vutherorisi , hi fanele hi endla matshalatshala lamakulu yo lwela ntshunxeko , xindzhuti , vululami ni ku antswisa timfanelo ta hina .\n", "2020-01-12 13:24:17,783 \tHypothesis: Loko hi ri karhi hi tikarhatela ku endla leswaku hi ni matimba , hi fanele hi va ni matimba yo biha , ku nga ku va ni ntshunxeko , ku va ni ntshunxeko , ku va ni ntshunxeko wa xiviri .\n", "2020-01-12 13:24:17,783 Example #3\n", "2020-01-12 13:24:17,783 \tSource: Baptism symbolizes dedication to God and marks one’s ordination as a minister\n", "2020-01-12 13:24:17,783 \tReference: Ku khuvuriwa ku kombisa ku tinyiketela eka Xikwembu naswona ku fungha ku vekiwa ka munhu tanihi mutirheli\n", "2020-01-12 13:24:17,783 \tHypothesis: Ku tinyiketela ka Xikwembu ku tinyiketela loku nga ni ku tinyiketela ka munhu ni ku va mutirheli wa munhu la nga ni vutihlamuleri\n", "2020-01-12 13:24:17,783 Validation result (greedy) at epoch 1, step 5000: bleu: 10.80, loss: 69489.7656, ppl: 10.5183, duration: 37.7377s\n", "2020-01-12 13:24:32,196 Epoch 1 Step: 5100 Batch Loss: 2.766508 Tokens per Sec: 17595, Lr: 0.000300\n", "2020-01-12 13:24:46,517 Epoch 1 Step: 5200 Batch Loss: 2.704270 Tokens per Sec: 17518, Lr: 0.000300\n", "2020-01-12 13:25:00,905 Epoch 1 Step: 5300 Batch Loss: 2.339511 Tokens per Sec: 17013, Lr: 0.000300\n", "2020-01-12 13:25:15,203 Epoch 1 Step: 5400 Batch Loss: 2.615668 Tokens per Sec: 17406, Lr: 0.000300\n", "2020-01-12 13:25:29,596 Epoch 1 Step: 5500 Batch Loss: 2.688394 Tokens per Sec: 17441, Lr: 0.000300\n", "2020-01-12 13:25:43,949 Epoch 1 Step: 5600 Batch Loss: 2.121804 Tokens per Sec: 17479, Lr: 0.000300\n", "2020-01-12 13:25:58,206 Epoch 1 Step: 5700 Batch Loss: 2.260647 Tokens per Sec: 16957, Lr: 0.000300\n", "2020-01-12 13:26:12,552 Epoch 1 Step: 5800 Batch Loss: 2.325348 Tokens per Sec: 17371, Lr: 0.000300\n", "2020-01-12 13:26:26,945 Epoch 1 Step: 5900 Batch Loss: 2.511302 Tokens per Sec: 17334, Lr: 0.000300\n", "2020-01-12 13:26:41,337 Epoch 1 Step: 6000 Batch Loss: 2.423702 Tokens per Sec: 17461, Lr: 0.000300\n", "2020-01-12 13:27:18,491 Hooray! 
New best validation result [ppl]!\n", "2020-01-12 13:27:18,491 Saving new checkpoint.\n", "2020-01-12 13:27:18,822 Example #0\n", "2020-01-12 13:27:18,823 \tSource: But , naturally , we also want relief .\n", "2020-01-12 13:27:18,823 \tReference: Kambe hi lava leswaku ku xaniseka ku herisiwa .\n", "2020-01-12 13:27:18,823 \tHypothesis: Kambe , hi lava ku va ni vutshembeki .\n", "2020-01-12 13:27:18,823 Example #1\n", "2020-01-12 13:27:18,823 \tSource: 3 “ We’re Letting You Go ”\n", "2020-01-12 13:27:18,823 \tReference: 3 “ Ntirho Wa Wena Wu Herile ”\n", "2020-01-12 13:27:18,823 \tHypothesis: 3 “ U Nga Ti Tshama ”\n", "2020-01-12 13:27:18,823 Example #2\n", "2020-01-12 13:27:18,823 \tSource: Paralleling our actions against terrorism , we must have equally vigorous efforts to enhance freedom , dignity , justice , and humanitarian values .\n", "2020-01-12 13:27:18,823 \tReference: Loko hi ri karhi hi lwisana ni vutherorisi , hi fanele hi endla matshalatshala lamakulu yo lwela ntshunxeko , xindzhuti , vululami ni ku antswisa timfanelo ta hina .\n", "2020-01-12 13:27:18,824 \tHypothesis: Ku navela ka hina ku nga khathariseki leswaku hi va ni vusodoma , hi fanele hi va ni vuyelo byo biha bya ku endla leswaku hi va ni ntshunxeko wa vululami , ku nga ri ku va ni vululami .\n", "2020-01-12 13:27:18,824 Example #3\n", "2020-01-12 13:27:18,824 \tSource: Baptism symbolizes dedication to God and marks one’s ordination as a minister\n", "2020-01-12 13:27:18,824 \tReference: Ku khuvuriwa ku kombisa ku tinyiketela eka Xikwembu naswona ku fungha ku vekiwa ka munhu tanihi mutirheli\n", "2020-01-12 13:27:18,824 \tHypothesis: Ku tinyiketela ka Xikwembu ku tinyiketela ka munhu ni ku va mutirheli wa munhu wa munhu la nga ni vutihlamuleri bya yena\n", "2020-01-12 13:27:18,824 Validation result (greedy) at epoch 1, step 6000: bleu: 12.64, loss: 65908.3672, ppl: 9.3170, duration: 37.4867s\n", "2020-01-12 13:27:33,177 Epoch 1 Step: 6100 Batch Loss: 2.395917 Tokens per Sec: 17298, Lr: 0.000300\n", "2020-01-12 13:27:47,550 Epoch 1 Step: 6200 Batch Loss: 2.487905 Tokens per Sec: 17493, Lr: 0.000300\n", "2020-01-12 13:28:02,016 Epoch 1 Step: 6300 Batch Loss: 2.164646 Tokens per Sec: 17475, Lr: 0.000300\n", "2020-01-12 13:28:16,493 Epoch 1 Step: 6400 Batch Loss: 2.144559 Tokens per Sec: 18034, Lr: 0.000300\n", "2020-01-12 13:28:30,891 Epoch 1 Step: 6500 Batch Loss: 2.313017 Tokens per Sec: 17366, Lr: 0.000300\n", "2020-01-12 13:28:45,221 Epoch 1 Step: 6600 Batch Loss: 2.365983 Tokens per Sec: 17410, Lr: 0.000300\n", "2020-01-12 13:28:59,624 Epoch 1 Step: 6700 Batch Loss: 2.002667 Tokens per Sec: 17858, Lr: 0.000300\n", "2020-01-12 13:29:14,006 Epoch 1 Step: 6800 Batch Loss: 2.107777 Tokens per Sec: 17388, Lr: 0.000300\n", "2020-01-12 13:29:28,354 Epoch 1 Step: 6900 Batch Loss: 2.090506 Tokens per Sec: 17377, Lr: 0.000300\n", "2020-01-12 13:29:42,741 Epoch 1 Step: 7000 Batch Loss: 2.375514 Tokens per Sec: 17634, Lr: 0.000300\n", "2020-01-12 13:30:20,093 Hooray! 
New best validation result [ppl]!\n", "2020-01-12 13:30:20,093 Saving new checkpoint.\n", "2020-01-12 13:30:20,444 Example #0\n", "2020-01-12 13:30:20,444 \tSource: But , naturally , we also want relief .\n", "2020-01-12 13:30:20,444 \tReference: Kambe hi lava leswaku ku xaniseka ku herisiwa .\n", "2020-01-12 13:30:20,444 \tHypothesis: Kambe , hi fanele hi tikarhatela ku va ni gome .\n", "2020-01-12 13:30:20,444 Example #1\n", "2020-01-12 13:30:20,444 \tSource: 3 “ We’re Letting You Go ”\n", "2020-01-12 13:30:20,444 \tReference: 3 “ Ntirho Wa Wena Wu Herile ”\n", "2020-01-12 13:30:20,445 \tHypothesis: 3 “ Ku Tlhela Ku Tlwela ”\n", "2020-01-12 13:30:20,445 Example #2\n", "2020-01-12 13:30:20,445 \tSource: Paralleling our actions against terrorism , we must have equally vigorous efforts to enhance freedom , dignity , justice , and humanitarian values .\n", "2020-01-12 13:30:20,445 \tReference: Loko hi ri karhi hi lwisana ni vutherorisi , hi fanele hi endla matshalatshala lamakulu yo lwela ntshunxeko , xindzhuti , vululami ni ku antswisa timfanelo ta hina .\n", "2020-01-12 13:30:20,445 \tHypothesis: Loko hi ringeta ku tikurisa hi ku tirhisa vusodoma , hi fanele hi tikarhatela ku endla matshalatshala yo endla leswaku hi va ni ntshunxeko , ku nga vululami ni vululami , vululami ni vululami .\n", "2020-01-12 13:30:20,445 Example #3\n", "2020-01-12 13:30:20,445 \tSource: Baptism symbolizes dedication to God and marks one’s ordination as a minister\n", "2020-01-12 13:30:20,445 \tReference: Ku khuvuriwa ku kombisa ku tinyiketela eka Xikwembu naswona ku fungha ku vekiwa ka munhu tanihi mutirheli\n", "2020-01-12 13:30:20,445 \tHypothesis: Ku tinyiketela ka vanhu ku tinyiketela ka Xikwembu ni ku tinyiketela ka munhu hi ndlela leyi faneleke ku va mutirheli wa munhu\n", "2020-01-12 13:30:20,445 Validation result (greedy) at epoch 1, step 7000: bleu: 14.45, loss: 62382.4844, ppl: 8.2684, duration: 37.7039s\n", "2020-01-12 13:30:34,862 Epoch 1 Step: 7100 Batch Loss: 1.914406 Tokens per Sec: 17439, Lr: 0.000300\n", "2020-01-12 13:30:49,208 Epoch 1 Step: 7200 Batch Loss: 2.693013 Tokens per Sec: 17272, Lr: 0.000300\n", "2020-01-12 13:31:03,576 Epoch 1 Step: 7300 Batch Loss: 2.513612 Tokens per Sec: 17679, Lr: 0.000300\n", "2020-01-12 13:31:17,913 Epoch 1 Step: 7400 Batch Loss: 2.204787 Tokens per Sec: 17396, Lr: 0.000300\n", "2020-01-12 13:31:32,376 Epoch 1 Step: 7500 Batch Loss: 2.450718 Tokens per Sec: 17600, Lr: 0.000300\n", "2020-01-12 13:31:46,726 Epoch 1 Step: 7600 Batch Loss: 2.173195 Tokens per Sec: 17194, Lr: 0.000300\n", "2020-01-12 13:32:01,153 Epoch 1 Step: 7700 Batch Loss: 2.057719 Tokens per Sec: 17164, Lr: 0.000300\n", "2020-01-12 13:32:15,559 Epoch 1 Step: 7800 Batch Loss: 2.514223 Tokens per Sec: 17411, Lr: 0.000300\n", "2020-01-12 13:32:29,968 Epoch 1 Step: 7900 Batch Loss: 2.234757 Tokens per Sec: 17255, Lr: 0.000300\n", "2020-01-12 13:32:44,314 Epoch 1 Step: 8000 Batch Loss: 2.200519 Tokens per Sec: 17167, Lr: 0.000300\n", "2020-01-12 13:33:21,500 Hooray! 
New best validation result [ppl]!\n", "2020-01-12 13:33:21,500 Saving new checkpoint.\n", "2020-01-12 13:33:21,838 Example #0\n", "2020-01-12 13:33:21,838 \tSource: But , naturally , we also want relief .\n", "2020-01-12 13:33:21,838 \tReference: Kambe hi lava leswaku ku xaniseka ku herisiwa .\n", "2020-01-12 13:33:21,838 \tHypothesis: Kambe , hi fanele hi tikarhatela ku va ni gome .\n", "2020-01-12 13:33:21,838 Example #1\n", "2020-01-12 13:33:21,839 \tSource: 3 “ We’re Letting You Go ”\n", "2020-01-12 13:33:21,839 \tReference: 3 “ Ntirho Wa Wena Wu Herile ”\n", "2020-01-12 13:33:21,839 \tHypothesis: 3 “ Ku Hlula U Nga Riki ”\n", "2020-01-12 13:33:21,839 Example #2\n", "2020-01-12 13:33:21,839 \tSource: Paralleling our actions against terrorism , we must have equally vigorous efforts to enhance freedom , dignity , justice , and humanitarian values .\n", "2020-01-12 13:33:21,839 \tReference: Loko hi ri karhi hi lwisana ni vutherorisi , hi fanele hi endla matshalatshala lamakulu yo lwela ntshunxeko , xindzhuti , vululami ni ku antswisa timfanelo ta hina .\n", "2020-01-12 13:33:21,839 \tHypothesis: Ku chava ka hina ku nga ha va ni vusodoma , hi fanele hi va ni vuyelo byo biha lebyi nga ni matimba , vululami ni vululami .\n", "2020-01-12 13:33:21,839 Example #3\n", "2020-01-12 13:33:21,840 \tSource: Baptism symbolizes dedication to God and marks one’s ordination as a minister\n", "2020-01-12 13:33:21,840 \tReference: Ku khuvuriwa ku kombisa ku tinyiketela eka Xikwembu naswona ku fungha ku vekiwa ka munhu tanihi mutirheli\n", "2020-01-12 13:33:21,840 \tHypothesis: Ku tinyiketela ka Bapalata ku tinyiketela ka Xikwembu ni ku tihlanganisa ni ku va mutirheli wa munhu\n", "2020-01-12 13:33:21,840 Validation result (greedy) at epoch 1, step 8000: bleu: 16.71, loss: 59565.6367, ppl: 7.5162, duration: 37.5254s\n", "2020-01-12 13:33:36,163 Epoch 1 Step: 8100 Batch Loss: 2.161650 Tokens per Sec: 16806, Lr: 0.000300\n", "2020-01-12 13:33:50,582 Epoch 1 Step: 8200 Batch Loss: 2.153435 Tokens per Sec: 17396, Lr: 0.000300\n", "2020-01-12 13:34:04,924 Epoch 1 Step: 8300 Batch Loss: 1.946867 Tokens per Sec: 17414, Lr: 0.000300\n", "2020-01-12 13:34:19,330 Epoch 1 Step: 8400 Batch Loss: 2.293966 Tokens per Sec: 17243, Lr: 0.000300\n", "2020-01-12 13:34:33,650 Epoch 1 Step: 8500 Batch Loss: 2.055956 Tokens per Sec: 17027, Lr: 0.000300\n", "2020-01-12 13:34:48,009 Epoch 1 Step: 8600 Batch Loss: 2.498449 Tokens per Sec: 17646, Lr: 0.000300\n", "2020-01-12 13:35:02,353 Epoch 1 Step: 8700 Batch Loss: 1.859478 Tokens per Sec: 17219, Lr: 0.000300\n", "2020-01-12 13:35:16,679 Epoch 1 Step: 8800 Batch Loss: 2.290185 Tokens per Sec: 17503, Lr: 0.000300\n", "2020-01-12 13:35:31,045 Epoch 1 Step: 8900 Batch Loss: 2.186468 Tokens per Sec: 17321, Lr: 0.000300\n", "2020-01-12 13:35:41,109 Epoch 1: total training loss 25963.28\n", "2020-01-12 13:35:41,110 EPOCH 2\n", "2020-01-12 13:35:46,770 Epoch 2 Step: 9000 Batch Loss: 2.350435 Tokens per Sec: 13841, Lr: 0.000300\n", "2020-01-12 13:36:24,192 Hooray! 
New best validation result [ppl]!\n", "2020-01-12 13:36:24,192 Saving new checkpoint.\n", "2020-01-12 13:36:24,484 Example #0\n", "2020-01-12 13:36:24,484 \tSource: But , naturally , we also want relief .\n", "2020-01-12 13:36:24,485 \tReference: Kambe hi lava leswaku ku xaniseka ku herisiwa .\n", "2020-01-12 13:36:24,485 \tHypothesis: Kambe , hi fanele hi va ni xivindzi .\n", "2020-01-12 13:36:24,485 Example #1\n", "2020-01-12 13:36:24,485 \tSource: 3 “ We’re Letting You Go ”\n", "2020-01-12 13:36:24,485 \tReference: 3 “ Ntirho Wa Wena Wu Herile ”\n", "2020-01-12 13:36:24,485 \tHypothesis: 3 “ Ku Lavisisa Ku Pfumela ”\n", "2020-01-12 13:36:24,485 Example #2\n", "2020-01-12 13:36:24,485 \tSource: Paralleling our actions against terrorism , we must have equally vigorous efforts to enhance freedom , dignity , justice , and humanitarian values .\n", "2020-01-12 13:36:24,485 \tReference: Loko hi ri karhi hi lwisana ni vutherorisi , hi fanele hi endla matshalatshala lamakulu yo lwela ntshunxeko , xindzhuti , vululami ni ku antswisa timfanelo ta hina .\n", "2020-01-12 13:36:24,485 \tHypothesis: Ku chava ka hina ku lwisana ni vusodoma , hi fanele hi va ni matshalatshala yo nyikela ntshunxeko , vululami ni vululami , vululami , vululami ni vululami .\n", "2020-01-12 13:36:24,485 Example #3\n", "2020-01-12 13:36:24,486 \tSource: Baptism symbolizes dedication to God and marks one’s ordination as a minister\n", "2020-01-12 13:36:24,486 \tReference: Ku khuvuriwa ku kombisa ku tinyiketela eka Xikwembu naswona ku fungha ku vekiwa ka munhu tanihi mutirheli\n", "2020-01-12 13:36:24,486 \tHypothesis: Ku tinyiketela loku nga ni vukanganyisi ku tinyiketela eka Xikwembu ni ku va mutirheli wa xona\n", "2020-01-12 13:36:24,486 Validation result (greedy) at epoch 2, step 9000: bleu: 19.01, loss: 56864.4609, ppl: 6.8592, duration: 37.7152s\n", "2020-01-12 13:36:38,913 Epoch 2 Step: 9100 Batch Loss: 1.944754 Tokens per Sec: 17387, Lr: 0.000300\n", "2020-01-12 13:36:53,314 Epoch 2 Step: 9200 Batch Loss: 1.831403 Tokens per Sec: 17773, Lr: 0.000300\n", "2020-01-12 13:37:07,664 Epoch 2 Step: 9300 Batch Loss: 1.983415 Tokens per Sec: 16990, Lr: 0.000300\n", "2020-01-12 13:37:22,029 Epoch 2 Step: 9400 Batch Loss: 1.931301 Tokens per Sec: 17425, Lr: 0.000300\n", "2020-01-12 13:37:36,403 Epoch 2 Step: 9500 Batch Loss: 2.153498 Tokens per Sec: 17126, Lr: 0.000300\n", "2020-01-12 13:37:50,717 Epoch 2 Step: 9600 Batch Loss: 1.914978 Tokens per Sec: 17252, Lr: 0.000300\n", "2020-01-12 13:38:05,103 Epoch 2 Step: 9700 Batch Loss: 2.087522 Tokens per Sec: 17307, Lr: 0.000300\n", "2020-01-12 13:38:19,521 Epoch 2 Step: 9800 Batch Loss: 1.844327 Tokens per Sec: 17838, Lr: 0.000300\n", "2020-01-12 13:38:33,872 Epoch 2 Step: 9900 Batch Loss: 2.139555 Tokens per Sec: 17195, Lr: 0.000300\n", "2020-01-12 13:38:48,253 Epoch 2 Step: 10000 Batch Loss: 1.890193 Tokens per Sec: 17545, Lr: 0.000300\n", "2020-01-12 13:39:25,275 Hooray! 
New best validation result [ppl]!\n", "2020-01-12 13:39:25,275 Saving new checkpoint.\n", "2020-01-12 13:39:25,597 Example #0\n", "2020-01-12 13:39:25,598 \tSource: But , naturally , we also want relief .\n", "2020-01-12 13:39:25,598 \tReference: Kambe hi lava leswaku ku xaniseka ku herisiwa .\n", "2020-01-12 13:39:25,598 \tHypothesis: Kambe , hi fanele hi tikarhatela .\n", "2020-01-12 13:39:25,598 Example #1\n", "2020-01-12 13:39:25,598 \tSource: 3 “ We’re Letting You Go ”\n", "2020-01-12 13:39:25,598 \tReference: 3 “ Ntirho Wa Wena Wu Herile ”\n", "2020-01-12 13:39:25,598 \tHypothesis: 3 “ U Nga Hlangi U Fanele U Fambela ”\n", "2020-01-12 13:39:25,598 Example #2\n", "2020-01-12 13:39:25,598 \tSource: Paralleling our actions against terrorism , we must have equally vigorous efforts to enhance freedom , dignity , justice , and humanitarian values .\n", "2020-01-12 13:39:25,598 \tReference: Loko hi ri karhi hi lwisana ni vutherorisi , hi fanele hi endla matshalatshala lamakulu yo lwela ntshunxeko , xindzhuti , vululami ni ku antswisa timfanelo ta hina .\n", "2020-01-12 13:39:25,599 \tHypothesis: Ku kombisa ku navela ka hina ku lwisana ni vutherorisi , hi fanele hi va ni vuyelo byo biha lebyi nga ni khombo , vululami ni vululami .\n", "2020-01-12 13:39:25,599 Example #3\n", "2020-01-12 13:39:25,599 \tSource: Baptism symbolizes dedication to God and marks one’s ordination as a minister\n", "2020-01-12 13:39:25,599 \tReference: Ku khuvuriwa ku kombisa ku tinyiketela eka Xikwembu naswona ku fungha ku vekiwa ka munhu tanihi mutirheli\n", "2020-01-12 13:39:25,599 \tHypothesis: Ku tinyiketela loku nga ni vukanganyisi ku tinyiketela eka Xikwembu ni ku va mutirheli wa munhu la nga ni mfanelo\n", "2020-01-12 13:39:25,599 Validation result (greedy) at epoch 2, step 10000: bleu: 19.83, loss: 54950.1172, ppl: 6.4286, duration: 37.3461s\n", "2020-01-12 13:39:40,025 Epoch 2 Step: 10100 Batch Loss: 1.942197 Tokens per Sec: 17716, Lr: 0.000300\n", "2020-01-12 13:39:54,368 Epoch 2 Step: 10200 Batch Loss: 2.019771 Tokens per Sec: 17253, Lr: 0.000300\n", "2020-01-12 13:40:08,780 Epoch 2 Step: 10300 Batch Loss: 2.188068 Tokens per Sec: 17430, Lr: 0.000300\n", "2020-01-12 13:40:23,203 Epoch 2 Step: 10400 Batch Loss: 1.882832 Tokens per Sec: 17166, Lr: 0.000300\n", "2020-01-12 13:40:37,658 Epoch 2 Step: 10500 Batch Loss: 1.700441 Tokens per Sec: 17410, Lr: 0.000300\n", "2020-01-12 13:40:52,097 Epoch 2 Step: 10600 Batch Loss: 2.182787 Tokens per Sec: 17588, Lr: 0.000300\n", "2020-01-12 13:41:06,478 Epoch 2 Step: 10700 Batch Loss: 1.912154 Tokens per Sec: 17442, Lr: 0.000300\n", "2020-01-12 13:41:20,769 Epoch 2 Step: 10800 Batch Loss: 1.872577 Tokens per Sec: 17158, Lr: 0.000300\n", "2020-01-12 13:41:35,075 Epoch 2 Step: 10900 Batch Loss: 2.178710 Tokens per Sec: 17138, Lr: 0.000300\n", "2020-01-12 13:41:49,428 Epoch 2 Step: 11000 Batch Loss: 1.750615 Tokens per Sec: 16979, Lr: 0.000300\n", "2020-01-12 13:42:26,817 Hooray! 
New best validation result [ppl]!\n", "2020-01-12 13:42:26,818 Saving new checkpoint.\n", "2020-01-12 13:42:27,183 Example #0\n", "2020-01-12 13:42:27,183 \tSource: But , naturally , we also want relief .\n", "2020-01-12 13:42:27,183 \tReference: Kambe hi lava leswaku ku xaniseka ku herisiwa .\n", "2020-01-12 13:42:27,183 \tHypothesis: Kambe , hi fanele hi tikarhatela ku va ni ntshembo .\n", "2020-01-12 13:42:27,183 Example #1\n", "2020-01-12 13:42:27,184 \tSource: 3 “ We’re Letting You Go ”\n", "2020-01-12 13:42:27,184 \tReference: 3 “ Ntirho Wa Wena Wu Herile ”\n", "2020-01-12 13:42:27,184 \tHypothesis: 3 “ Ku Hundzuka ”\n", "2020-01-12 13:42:27,184 Example #2\n", "2020-01-12 13:42:27,184 \tSource: Paralleling our actions against terrorism , we must have equally vigorous efforts to enhance freedom , dignity , justice , and humanitarian values .\n", "2020-01-12 13:42:27,184 \tReference: Loko hi ri karhi hi lwisana ni vutherorisi , hi fanele hi endla matshalatshala lamakulu yo lwela ntshunxeko , xindzhuti , vululami ni ku antswisa timfanelo ta hina .\n", "2020-01-12 13:42:27,184 \tHypothesis: Ku endla swilo swa hina leswi hi swi endlaka , hi fanele hi va ni matshalatshala yo kuma ntshunxeko , ku nga vululami , vuhomboloki ni vuhomboloki .\n", "2020-01-12 13:42:27,184 Example #3\n", "2020-01-12 13:42:27,185 \tSource: Baptism symbolizes dedication to God and marks one’s ordination as a minister\n", "2020-01-12 13:42:27,185 \tReference: Ku khuvuriwa ku kombisa ku tinyiketela eka Xikwembu naswona ku fungha ku vekiwa ka munhu tanihi mutirheli\n", "2020-01-12 13:42:27,185 \tHypothesis: Ku tinyiketela ka Xikwembu ni ku kombisa ku tinyiketela ka munhu hi xiyexe tanihi mutirheli\n", "2020-01-12 13:42:27,185 Validation result (greedy) at epoch 2, step 11000: bleu: 20.71, loss: 54042.6953, ppl: 6.2341, duration: 37.7569s\n", "2020-01-12 13:42:41,622 Epoch 2 Step: 11100 Batch Loss: 2.078577 Tokens per Sec: 17288, Lr: 0.000300\n", "2020-01-12 13:42:56,035 Epoch 2 Step: 11200 Batch Loss: 2.024283 Tokens per Sec: 17842, Lr: 0.000300\n", "2020-01-12 13:43:10,309 Epoch 2 Step: 11300 Batch Loss: 1.833985 Tokens per Sec: 16862, Lr: 0.000300\n", "2020-01-12 13:43:24,701 Epoch 2 Step: 11400 Batch Loss: 2.070680 Tokens per Sec: 17520, Lr: 0.000300\n", "2020-01-12 13:43:39,127 Epoch 2 Step: 11500 Batch Loss: 2.033121 Tokens per Sec: 17431, Lr: 0.000300\n", "2020-01-12 13:43:53,536 Epoch 2 Step: 11600 Batch Loss: 1.986960 Tokens per Sec: 17593, Lr: 0.000300\n", "2020-01-12 13:44:07,954 Epoch 2 Step: 11700 Batch Loss: 2.017812 Tokens per Sec: 17492, Lr: 0.000300\n", "2020-01-12 13:44:22,299 Epoch 2 Step: 11800 Batch Loss: 2.117501 Tokens per Sec: 17315, Lr: 0.000300\n", "2020-01-12 13:44:36,657 Epoch 2 Step: 11900 Batch Loss: 1.761721 Tokens per Sec: 16942, Lr: 0.000300\n", "2020-01-12 13:44:51,030 Epoch 2 Step: 12000 Batch Loss: 2.120974 Tokens per Sec: 17689, Lr: 0.000300\n", "2020-01-12 13:45:28,230 Hooray! 
New best validation result [ppl]!\n", "2020-01-12 13:45:28,230 Saving new checkpoint.\n", "2020-01-12 13:45:28,537 Example #0\n", "2020-01-12 13:45:28,538 \tSource: But , naturally , we also want relief .\n", "2020-01-12 13:45:28,538 \tReference: Kambe hi lava leswaku ku xaniseka ku herisiwa .\n", "2020-01-12 13:45:28,538 \tHypothesis: Kambe , hi fanele hi tikarhatela ku va ni ntshembo .\n", "2020-01-12 13:45:28,539 Example #1\n", "2020-01-12 13:45:28,539 \tSource: 3 “ We’re Letting You Go ”\n", "2020-01-12 13:45:28,539 \tReference: 3 “ Ntirho Wa Wena Wu Herile ”\n", "2020-01-12 13:45:28,539 \tHypothesis: 3 “ Ku Lavisisa U Fambela ”\n", "2020-01-12 13:45:28,539 Example #2\n", "2020-01-12 13:45:28,539 \tSource: Paralleling our actions against terrorism , we must have equally vigorous efforts to enhance freedom , dignity , justice , and humanitarian values .\n", "2020-01-12 13:45:28,539 \tReference: Loko hi ri karhi hi lwisana ni vutherorisi , hi fanele hi endla matshalatshala lamakulu yo lwela ntshunxeko , xindzhuti , vululami ni ku antswisa timfanelo ta hina .\n", "2020-01-12 13:45:28,539 \tHypothesis: Ku tikarhatela ku herisa vungoma bya hina , hi fanele hi va ni matshalatshala yo kuma ntshunxeko , vululami ni vululami bya vanhu .\n", "2020-01-12 13:45:28,540 Example #3\n", "2020-01-12 13:45:28,540 \tSource: Baptism symbolizes dedication to God and marks one’s ordination as a minister\n", "2020-01-12 13:45:28,540 \tReference: Ku khuvuriwa ku kombisa ku tinyiketela eka Xikwembu naswona ku fungha ku vekiwa ka munhu tanihi mutirheli\n", "2020-01-12 13:45:28,540 \tHypothesis: Ku tinyiketela eka Xikwembu ni ku kombisa ku tinyiketela ka munhu ni ku kombisa ku va mutirheli\n", "2020-01-12 13:45:28,540 Validation result (greedy) at epoch 2, step 12000: bleu: 22.64, loss: 51465.3594, ppl: 5.7131, duration: 37.5103s\n", "2020-01-12 13:45:42,916 Epoch 2 Step: 12100 Batch Loss: 1.926992 Tokens per Sec: 17422, Lr: 0.000300\n", "2020-01-12 13:45:57,328 Epoch 2 Step: 12200 Batch Loss: 1.823325 Tokens per Sec: 17496, Lr: 0.000300\n", "2020-01-12 13:46:11,724 Epoch 2 Step: 12300 Batch Loss: 1.941741 Tokens per Sec: 17416, Lr: 0.000300\n", "2020-01-12 13:46:26,167 Epoch 2 Step: 12400 Batch Loss: 1.826409 Tokens per Sec: 17260, Lr: 0.000300\n", "2020-01-12 13:46:40,531 Epoch 2 Step: 12500 Batch Loss: 1.943573 Tokens per Sec: 17259, Lr: 0.000300\n", "2020-01-12 13:46:54,856 Epoch 2 Step: 12600 Batch Loss: 1.758467 Tokens per Sec: 17548, Lr: 0.000300\n", "2020-01-12 13:47:09,212 Epoch 2 Step: 12700 Batch Loss: 1.916288 Tokens per Sec: 17371, Lr: 0.000300\n", "2020-01-12 13:47:23,513 Epoch 2 Step: 12800 Batch Loss: 2.099286 Tokens per Sec: 17320, Lr: 0.000300\n", "2020-01-12 13:47:37,794 Epoch 2 Step: 12900 Batch Loss: 1.720808 Tokens per Sec: 17394, Lr: 0.000300\n", "2020-01-12 13:47:52,184 Epoch 2 Step: 13000 Batch Loss: 2.086654 Tokens per Sec: 17398, Lr: 0.000300\n", "2020-01-12 13:48:29,513 Hooray! 
New best validation result [ppl]!\n", "2020-01-12 13:48:29,513 Saving new checkpoint.\n", "2020-01-12 13:48:29,841 Example #0\n", "2020-01-12 13:48:29,841 \tSource: But , naturally , we also want relief .\n", "2020-01-12 13:48:29,841 \tReference: Kambe hi lava leswaku ku xaniseka ku herisiwa .\n", "2020-01-12 13:48:29,841 \tHypothesis: Kambe , hi ku hatlisa , na hina hi lava ku ntshunxiwa .\n", "2020-01-12 13:48:29,841 Example #1\n", "2020-01-12 13:48:29,841 \tSource: 3 “ We’re Letting You Go ”\n", "2020-01-12 13:48:29,842 \tReference: 3 “ Ntirho Wa Wena Wu Herile ”\n", "2020-01-12 13:48:29,842 \tHypothesis: 3 “ Ku Pfuna Ku Tshika U Ri Ni Nkoka ”\n", "2020-01-12 13:48:29,842 Example #2\n", "2020-01-12 13:48:29,842 \tSource: Paralleling our actions against terrorism , we must have equally vigorous efforts to enhance freedom , dignity , justice , and humanitarian values .\n", "2020-01-12 13:48:29,842 \tReference: Loko hi ri karhi hi lwisana ni vutherorisi , hi fanele hi endla matshalatshala lamakulu yo lwela ntshunxeko , xindzhuti , vululami ni ku antswisa timfanelo ta hina .\n", "2020-01-12 13:48:29,842 \tHypothesis: Loko hi ri karhi hi tikarhatela ku endla swilo leswi hi nga swi kotaka , hi fanele hi va ni matshalatshala yo kuma ntshunxeko , vululami ni vululami .\n", "2020-01-12 13:48:29,842 Example #3\n", "2020-01-12 13:48:29,843 \tSource: Baptism symbolizes dedication to God and marks one’s ordination as a minister\n", "2020-01-12 13:48:29,843 \tReference: Ku khuvuriwa ku kombisa ku tinyiketela eka Xikwembu naswona ku fungha ku vekiwa ka munhu tanihi mutirheli\n", "2020-01-12 13:48:29,843 \tHypothesis: Vuyelo bya vugevenga byi kombisa ku tinyiketela eka Xikwembu ni ku kombisa ku va mutirheli wa munhu tanihi mutirheli\n", "2020-01-12 13:48:29,843 Validation result (greedy) at epoch 2, step 13000: bleu: 23.32, loss: 50235.1797, ppl: 5.4800, duration: 37.6590s\n", "2020-01-12 13:48:44,330 Epoch 2 Step: 13100 Batch Loss: 1.654059 Tokens per Sec: 17470, Lr: 0.000300\n", "2020-01-12 13:48:58,695 Epoch 2 Step: 13200 Batch Loss: 2.073310 Tokens per Sec: 17570, Lr: 0.000300\n", "2020-01-12 13:49:12,991 Epoch 2 Step: 13300 Batch Loss: 1.889394 Tokens per Sec: 17589, Lr: 0.000300\n", "2020-01-12 13:49:27,311 Epoch 2 Step: 13400 Batch Loss: 1.799801 Tokens per Sec: 17272, Lr: 0.000300\n", "2020-01-12 13:49:41,713 Epoch 2 Step: 13500 Batch Loss: 2.014540 Tokens per Sec: 17519, Lr: 0.000300\n", "2020-01-12 13:49:55,998 Epoch 2 Step: 13600 Batch Loss: 1.900150 Tokens per Sec: 17399, Lr: 0.000300\n", "2020-01-12 13:50:10,340 Epoch 2 Step: 13700 Batch Loss: 1.949418 Tokens per Sec: 17484, Lr: 0.000300\n", "2020-01-12 13:50:24,774 Epoch 2 Step: 13800 Batch Loss: 1.698372 Tokens per Sec: 17783, Lr: 0.000300\n", "2020-01-12 13:50:39,067 Epoch 2 Step: 13900 Batch Loss: 1.724535 Tokens per Sec: 17649, Lr: 0.000300\n", "2020-01-12 13:50:53,437 Epoch 2 Step: 14000 Batch Loss: 2.276809 Tokens per Sec: 17553, Lr: 0.000300\n", "2020-01-12 13:51:30,646 Hooray! 
New best validation result [ppl]!\n", "2020-01-12 13:51:30,646 Saving new checkpoint.\n", "2020-01-12 13:51:30,960 Example #0\n", "2020-01-12 13:51:30,960 \tSource: But , naturally , we also want relief .\n", "2020-01-12 13:51:30,960 \tReference: Kambe hi lava leswaku ku xaniseka ku herisiwa .\n", "2020-01-12 13:51:30,960 \tHypothesis: Kambe , hi fanele hi lava ku tiyisela .\n", "2020-01-12 13:51:30,960 Example #1\n", "2020-01-12 13:51:30,960 \tSource: 3 “ We’re Letting You Go ”\n", "2020-01-12 13:51:30,960 \tReference: 3 “ Ntirho Wa Wena Wu Herile ”\n", "2020-01-12 13:51:30,960 \tHypothesis: 3 “ Ku Tirhisa Ku Tirha ”\n", "2020-01-12 13:51:30,961 Example #2\n", "2020-01-12 13:51:30,961 \tSource: Paralleling our actions against terrorism , we must have equally vigorous efforts to enhance freedom , dignity , justice , and humanitarian values .\n", "2020-01-12 13:51:30,961 \tReference: Loko hi ri karhi hi lwisana ni vutherorisi , hi fanele hi endla matshalatshala lamakulu yo lwela ntshunxeko , xindzhuti , vululami ni ku antswisa timfanelo ta hina .\n", "2020-01-12 13:51:30,961 \tHypothesis: Ku tikurisa ka hina ku lwisana ni ku tidlaya , hi fanele hi va ni matshalatshala yo kuma ntshunxeko , vululami , vululami ni vululami .\n", "2020-01-12 13:51:30,961 Example #3\n", "2020-01-12 13:51:30,961 \tSource: Baptism symbolizes dedication to God and marks one’s ordination as a minister\n", "2020-01-12 13:51:30,961 \tReference: Ku khuvuriwa ku kombisa ku tinyiketela eka Xikwembu naswona ku fungha ku vekiwa ka munhu tanihi mutirheli\n", "2020-01-12 13:51:30,961 \tHypothesis: Ku tinyiketela eka Xikwembu ni ku kombisa ku tinyiketela ka munhu ku fana ni mutirheli\n", "2020-01-12 13:51:30,961 Validation result (greedy) at epoch 2, step 14000: bleu: 24.47, loss: 48921.6953, ppl: 5.2416, duration: 37.5242s\n", "2020-01-12 13:51:45,392 Epoch 2 Step: 14100 Batch Loss: 2.248385 Tokens per Sec: 17366, Lr: 0.000300\n", "2020-01-12 13:51:59,757 Epoch 2 Step: 14200 Batch Loss: 1.832864 Tokens per Sec: 17702, Lr: 0.000300\n", "2020-01-12 13:52:14,116 Epoch 2 Step: 14300 Batch Loss: 1.649870 Tokens per Sec: 17520, Lr: 0.000300\n", "2020-01-12 13:52:28,538 Epoch 2 Step: 14400 Batch Loss: 1.640168 Tokens per Sec: 17626, Lr: 0.000300\n", "2020-01-12 13:52:42,951 Epoch 2 Step: 14500 Batch Loss: 1.792805 Tokens per Sec: 17643, Lr: 0.000300\n", "2020-01-12 13:52:57,328 Epoch 2 Step: 14600 Batch Loss: 1.971055 Tokens per Sec: 17520, Lr: 0.000300\n", "2020-01-12 13:53:11,658 Epoch 2 Step: 14700 Batch Loss: 2.086929 Tokens per Sec: 17698, Lr: 0.000300\n", "2020-01-12 13:53:26,060 Epoch 2 Step: 14800 Batch Loss: 2.113154 Tokens per Sec: 17557, Lr: 0.000300\n", "2020-01-12 13:53:40,409 Epoch 2 Step: 14900 Batch Loss: 1.603640 Tokens per Sec: 17496, Lr: 0.000300\n", "2020-01-12 13:53:54,784 Epoch 2 Step: 15000 Batch Loss: 1.678053 Tokens per Sec: 17334, Lr: 0.000300\n", "2020-01-12 13:54:32,014 Hooray! 
New best validation result [ppl]!\n", "2020-01-12 13:54:32,014 Saving new checkpoint.\n", "2020-01-12 13:54:32,324 Example #0\n", "2020-01-12 13:54:32,325 \tSource: But , naturally , we also want relief .\n", "2020-01-12 13:54:32,325 \tReference: Kambe hi lava leswaku ku xaniseka ku herisiwa .\n", "2020-01-12 13:54:32,325 \tHypothesis: Kambe , hi ntolovelo hi lava ku ntshunxiwa .\n", "2020-01-12 13:54:32,325 Example #1\n", "2020-01-12 13:54:32,325 \tSource: 3 “ We’re Letting You Go ”\n", "2020-01-12 13:54:32,325 \tReference: 3 “ Ntirho Wa Wena Wu Herile ”\n", "2020-01-12 13:54:32,325 \tHypothesis: 3 “ Mi Nga Ku Kombisa ”\n", "2020-01-12 13:54:32,326 Example #2\n", "2020-01-12 13:54:32,326 \tSource: Paralleling our actions against terrorism , we must have equally vigorous efforts to enhance freedom , dignity , justice , and humanitarian values .\n", "2020-01-12 13:54:32,326 \tReference: Loko hi ri karhi hi lwisana ni vutherorisi , hi fanele hi endla matshalatshala lamakulu yo lwela ntshunxeko , xindzhuti , vululami ni ku antswisa timfanelo ta hina .\n", "2020-01-12 13:54:32,326 \tHypothesis: Loko hi ri karhi hi anakanyisisa hi swiendlo swa hina swo lwisana ni vutherorisi , hi fanele hi va ni matshalatshala yo kuma ntshunxeko , vululami , vululami ni vululami .\n", "2020-01-12 13:54:32,326 Example #3\n", "2020-01-12 13:54:32,326 \tSource: Baptism symbolizes dedication to God and marks one’s ordination as a minister\n", "2020-01-12 13:54:32,326 \tReference: Ku khuvuriwa ku kombisa ku tinyiketela eka Xikwembu naswona ku fungha ku vekiwa ka munhu tanihi mutirheli\n", "2020-01-12 13:54:32,327 \tHypothesis: Ku tinyiketela loku nga fanelangiki ku tinyiketela eka Xikwembu ni ku kombisa ku va mutirheli\n", "2020-01-12 13:54:32,327 Validation result (greedy) at epoch 2, step 15000: bleu: 24.90, loss: 48019.0469, ppl: 5.0838, duration: 37.5421s\n", "2020-01-12 13:54:46,684 Epoch 2 Step: 15100 Batch Loss: 1.982115 Tokens per Sec: 17320, Lr: 0.000300\n", "2020-01-12 13:55:01,045 Epoch 2 Step: 15200 Batch Loss: 1.716894 Tokens per Sec: 17438, Lr: 0.000300\n", "2020-01-12 13:55:15,388 Epoch 2 Step: 15300 Batch Loss: 2.315532 Tokens per Sec: 17680, Lr: 0.000300\n", "2020-01-12 13:55:29,773 Epoch 2 Step: 15400 Batch Loss: 1.734112 Tokens per Sec: 17347, Lr: 0.000300\n", "2020-01-12 13:55:44,097 Epoch 2 Step: 15500 Batch Loss: 1.817530 Tokens per Sec: 17257, Lr: 0.000300\n", "2020-01-12 13:55:58,509 Epoch 2 Step: 15600 Batch Loss: 1.819745 Tokens per Sec: 17609, Lr: 0.000300\n", "2020-01-12 13:56:12,840 Epoch 2 Step: 15700 Batch Loss: 1.988664 Tokens per Sec: 17134, Lr: 0.000300\n", "2020-01-12 13:56:27,235 Epoch 2 Step: 15800 Batch Loss: 1.603447 Tokens per Sec: 17467, Lr: 0.000300\n", "2020-01-12 13:56:41,522 Epoch 2 Step: 15900 Batch Loss: 1.850164 Tokens per Sec: 17250, Lr: 0.000300\n", "2020-01-12 13:56:55,888 Epoch 2 Step: 16000 Batch Loss: 1.718192 Tokens per Sec: 17741, Lr: 0.000300\n", "2020-01-12 13:57:33,181 Hooray! 
New best validation result [ppl]!\n", "2020-01-12 13:57:33,181 Saving new checkpoint.\n", "2020-01-12 13:57:33,477 Example #0\n", "2020-01-12 13:57:33,477 \tSource: But , naturally , we also want relief .\n", "2020-01-12 13:57:33,477 \tReference: Kambe hi lava leswaku ku xaniseka ku herisiwa .\n", "2020-01-12 13:57:33,477 \tHypothesis: Kambe , hi ntolovelo hi lava ku ntshunxiwa .\n", "2020-01-12 13:57:33,477 Example #1\n", "2020-01-12 13:57:33,477 \tSource: 3 “ We’re Letting You Go ”\n", "2020-01-12 13:57:33,478 \tReference: 3 “ Ntirho Wa Wena Wu Herile ”\n", "2020-01-12 13:57:33,478 \tHypothesis: 3 “ Mi Hundzuka Mi Va N’wina ”\n", "2020-01-12 13:57:33,478 Example #2\n", "2020-01-12 13:57:33,478 \tSource: Paralleling our actions against terrorism , we must have equally vigorous efforts to enhance freedom , dignity , justice , and humanitarian values .\n", "2020-01-12 13:57:33,478 \tReference: Loko hi ri karhi hi lwisana ni vutherorisi , hi fanele hi endla matshalatshala lamakulu yo lwela ntshunxeko , xindzhuti , vululami ni ku antswisa timfanelo ta hina .\n", "2020-01-12 13:57:33,478 \tHypothesis: Loko hi pfumelela swiendlo swa hina swo lwisana ni ku tidlaya , hi fanele hi va ni matshalatshala yo kuma ntshunxeko , vululami ni vululami .\n", "2020-01-12 13:57:33,478 Example #3\n", "2020-01-12 13:57:33,478 \tSource: Baptism symbolizes dedication to God and marks one’s ordination as a minister\n", "2020-01-12 13:57:33,478 \tReference: Ku khuvuriwa ku kombisa ku tinyiketela eka Xikwembu naswona ku fungha ku vekiwa ka munhu tanihi mutirheli\n", "2020-01-12 13:57:33,478 \tHypothesis: Ku tinyiketela eka Xikwembu ni ku kombisa ku tinyiketela ka munhu tanihi mutirheli\n", "2020-01-12 13:57:33,479 Validation result (greedy) at epoch 2, step 16000: bleu: 25.99, loss: 47117.4023, ppl: 4.9309, duration: 37.5902s\n", "2020-01-12 13:57:47,949 Epoch 2 Step: 16100 Batch Loss: 1.430098 Tokens per Sec: 17563, Lr: 0.000300\n", "2020-01-12 13:58:02,275 Epoch 2 Step: 16200 Batch Loss: 1.626592 Tokens per Sec: 17320, Lr: 0.000300\n", "2020-01-12 13:58:16,641 Epoch 2 Step: 16300 Batch Loss: 1.687197 Tokens per Sec: 17250, Lr: 0.000300\n", "2020-01-12 13:58:30,971 Epoch 2 Step: 16400 Batch Loss: 1.953417 Tokens per Sec: 17345, Lr: 0.000300\n", "2020-01-12 13:58:45,283 Epoch 2 Step: 16500 Batch Loss: 1.832766 Tokens per Sec: 17240, Lr: 0.000300\n", "2020-01-12 13:58:59,723 Epoch 2 Step: 16600 Batch Loss: 1.806802 Tokens per Sec: 17561, Lr: 0.000300\n", "2020-01-12 13:59:14,193 Epoch 2 Step: 16700 Batch Loss: 1.631533 Tokens per Sec: 17514, Lr: 0.000300\n", "2020-01-12 13:59:28,567 Epoch 2 Step: 16800 Batch Loss: 1.851936 Tokens per Sec: 17599, Lr: 0.000300\n", "2020-01-12 13:59:42,989 Epoch 2 Step: 16900 Batch Loss: 1.782361 Tokens per Sec: 17269, Lr: 0.000300\n", "2020-01-12 13:59:57,361 Epoch 2 Step: 17000 Batch Loss: 1.872059 Tokens per Sec: 17118, Lr: 0.000300\n", "2020-01-12 14:00:34,684 Hooray! 
New best validation result [ppl]!\n", "2020-01-12 14:00:34,684 Saving new checkpoint.\n", "2020-01-12 14:00:34,977 Example #0\n", "2020-01-12 14:00:34,977 \tSource: But , naturally , we also want relief .\n", "2020-01-12 14:00:34,977 \tReference: Kambe hi lava leswaku ku xaniseka ku herisiwa .\n", "2020-01-12 14:00:34,977 \tHypothesis: Kambe , hi ku hatlisa hi lava ku ntshunxiwa .\n", "2020-01-12 14:00:34,977 Example #1\n", "2020-01-12 14:00:34,977 \tSource: 3 “ We’re Letting You Go ”\n", "2020-01-12 14:00:34,978 \tReference: 3 “ Ntirho Wa Wena Wu Herile ”\n", "2020-01-12 14:00:34,978 \tHypothesis: 3 “ Mi Twisisa ”\n", "2020-01-12 14:00:34,978 Example #2\n", "2020-01-12 14:00:34,978 \tSource: Paralleling our actions against terrorism , we must have equally vigorous efforts to enhance freedom , dignity , justice , and humanitarian values .\n", "2020-01-12 14:00:34,978 \tReference: Loko hi ri karhi hi lwisana ni vutherorisi , hi fanele hi endla matshalatshala lamakulu yo lwela ntshunxeko , xindzhuti , vululami ni ku antswisa timfanelo ta hina .\n", "2020-01-12 14:00:34,978 \tHypothesis: Loko hi ri karhi hi anakanyisisa hi ku tikukumuxa ka hina , hi fanele hi va ni matshalatshala yo kuma ntshunxeko , vuhomboloki , vuhomboloki ni ya vanhu .\n", "2020-01-12 14:00:34,978 Example #3\n", "2020-01-12 14:00:34,979 \tSource: Baptism symbolizes dedication to God and marks one’s ordination as a minister\n", "2020-01-12 14:00:34,979 \tReference: Ku khuvuriwa ku kombisa ku tinyiketela eka Xikwembu naswona ku fungha ku vekiwa ka munhu tanihi mutirheli\n", "2020-01-12 14:00:34,979 \tHypothesis: Ku tinyiketela ka yena ku kombisa ku tinyiketela eka Xikwembu ni ku kombisa ku va mutirheli wa munhu\n", "2020-01-12 14:00:34,979 Validation result (greedy) at epoch 2, step 17000: bleu: 26.51, loss: 46240.2891, ppl: 4.7866, duration: 37.6179s\n", "2020-01-12 14:00:49,329 Epoch 2 Step: 17100 Batch Loss: 1.627373 Tokens per Sec: 17510, Lr: 0.000300\n", "2020-01-12 14:01:03,720 Epoch 2 Step: 17200 Batch Loss: 1.892281 Tokens per Sec: 17571, Lr: 0.000300\n", "2020-01-12 14:01:18,024 Epoch 2 Step: 17300 Batch Loss: 1.803643 Tokens per Sec: 17060, Lr: 0.000300\n", "2020-01-12 14:01:32,407 Epoch 2 Step: 17400 Batch Loss: 1.903627 Tokens per Sec: 17230, Lr: 0.000300\n", "2020-01-12 14:01:46,761 Epoch 2 Step: 17500 Batch Loss: 1.591697 Tokens per Sec: 17688, Lr: 0.000300\n", "2020-01-12 14:02:01,205 Epoch 2 Step: 17600 Batch Loss: 1.660056 Tokens per Sec: 17556, Lr: 0.000300\n", "2020-01-12 14:02:15,520 Epoch 2 Step: 17700 Batch Loss: 1.748446 Tokens per Sec: 17294, Lr: 0.000300\n", "2020-01-12 14:02:29,980 Epoch 2 Step: 17800 Batch Loss: 1.890546 Tokens per Sec: 17380, Lr: 0.000300\n", "2020-01-12 14:02:44,381 Epoch 2 Step: 17900 Batch Loss: 1.517990 Tokens per Sec: 17724, Lr: 0.000300\n", "2020-01-12 14:02:49,449 Epoch 2: total training loss 16977.22\n", "2020-01-12 14:02:49,450 EPOCH 3\n", "2020-01-12 14:02:59,966 Epoch 3 Step: 18000 Batch Loss: 1.680860 Tokens per Sec: 15441, Lr: 0.000300\n", "2020-01-12 14:03:37,215 Hooray! 
New best validation result [ppl]!\n", "2020-01-12 14:03:37,216 Saving new checkpoint.\n", "2020-01-12 14:03:37,578 Example #0\n", "2020-01-12 14:03:37,578 \tSource: But , naturally , we also want relief .\n", "2020-01-12 14:03:37,578 \tReference: Kambe hi lava leswaku ku xaniseka ku herisiwa .\n", "2020-01-12 14:03:37,578 \tHypothesis: Kambe , hi ku hatlisa hi lava ku ntshunxiwa .\n", "2020-01-12 14:03:37,578 Example #1\n", "2020-01-12 14:03:37,579 \tSource: 3 “ We’re Letting You Go ”\n", "2020-01-12 14:03:37,579 \tReference: 3 “ Ntirho Wa Wena Wu Herile ”\n", "2020-01-12 14:03:37,579 \tHypothesis: 3 “ Ku Tiyisela Ku Endla ”\n", "2020-01-12 14:03:37,579 Example #2\n", "2020-01-12 14:03:37,579 \tSource: Paralleling our actions against terrorism , we must have equally vigorous efforts to enhance freedom , dignity , justice , and humanitarian values .\n", "2020-01-12 14:03:37,579 \tReference: Loko hi ri karhi hi lwisana ni vutherorisi , hi fanele hi endla matshalatshala lamakulu yo lwela ntshunxeko , xindzhuti , vululami ni ku antswisa timfanelo ta hina .\n", "2020-01-12 14:03:37,579 \tHypothesis: Loko hi anakanya hi swiendlo swa hina swo lwisana ni ku tidlaya , hi fanele hi va ni matshalatshala lamakulu yo kuma ntshunxeko , vululami , vululami ni ya vanhu .\n", "2020-01-12 14:03:37,579 Example #3\n", "2020-01-12 14:03:37,579 \tSource: Baptism symbolizes dedication to God and marks one’s ordination as a minister\n", "2020-01-12 14:03:37,579 \tReference: Ku khuvuriwa ku kombisa ku tinyiketela eka Xikwembu naswona ku fungha ku vekiwa ka munhu tanihi mutirheli\n", "2020-01-12 14:03:37,580 \tHypothesis: Vuhundzuluxeri bya Baptisele byi kombisa ku tinyiketela eka Xikwembu ni ku teka ku ri mutirheli wa xona tanihi mutirheli\n", "2020-01-12 14:03:37,580 Validation result (greedy) at epoch 3, step 18000: bleu: 26.79, loss: 45468.5664, ppl: 4.6631, duration: 37.6138s\n", "2020-01-12 14:03:52,103 Epoch 3 Step: 18100 Batch Loss: 1.696411 Tokens per Sec: 17590, Lr: 0.000300\n", "2020-01-12 14:04:06,496 Epoch 3 Step: 18200 Batch Loss: 1.971689 Tokens per Sec: 17532, Lr: 0.000300\n", "2020-01-12 14:04:20,854 Epoch 3 Step: 18300 Batch Loss: 1.871724 Tokens per Sec: 17472, Lr: 0.000300\n", "2020-01-12 14:04:35,239 Epoch 3 Step: 18400 Batch Loss: 1.680677 Tokens per Sec: 17268, Lr: 0.000300\n", "2020-01-12 14:04:49,579 Epoch 3 Step: 18500 Batch Loss: 1.825892 Tokens per Sec: 17764, Lr: 0.000300\n", "2020-01-12 14:05:03,976 Epoch 3 Step: 18600 Batch Loss: 1.500213 Tokens per Sec: 17280, Lr: 0.000300\n", "2020-01-12 14:05:18,383 Epoch 3 Step: 18700 Batch Loss: 1.805880 Tokens per Sec: 17434, Lr: 0.000300\n", "2020-01-12 14:05:32,818 Epoch 3 Step: 18800 Batch Loss: 1.809355 Tokens per Sec: 17461, Lr: 0.000300\n", "2020-01-12 14:05:47,258 Epoch 3 Step: 18900 Batch Loss: 1.520123 Tokens per Sec: 17494, Lr: 0.000300\n", "2020-01-12 14:06:01,690 Epoch 3 Step: 19000 Batch Loss: 1.923366 Tokens per Sec: 17450, Lr: 0.000300\n", "2020-01-12 14:06:38,907 Hooray! 
New best validation result [ppl]!\n", "2020-01-12 14:06:38,907 Saving new checkpoint.\n", "2020-01-12 14:06:39,213 Example #0\n", "2020-01-12 14:06:39,214 \tSource: But , naturally , we also want relief .\n", "2020-01-12 14:06:39,214 \tReference: Kambe hi lava leswaku ku xaniseka ku herisiwa .\n", "2020-01-12 14:06:39,214 \tHypothesis: Kambe , hi ku hatlisa hi lava ku ntshunxiwa .\n", "2020-01-12 14:06:39,214 Example #1\n", "2020-01-12 14:06:39,214 \tSource: 3 “ We’re Letting You Go ”\n", "2020-01-12 14:06:39,214 \tReference: 3 “ Ntirho Wa Wena Wu Herile ”\n", "2020-01-12 14:06:39,215 \tHypothesis: 3 “ Mi Nga Ku Tirhisa ”\n", "2020-01-12 14:06:39,215 Example #2\n", "2020-01-12 14:06:39,215 \tSource: Paralleling our actions against terrorism , we must have equally vigorous efforts to enhance freedom , dignity , justice , and humanitarian values .\n", "2020-01-12 14:06:39,215 \tReference: Loko hi ri karhi hi lwisana ni vutherorisi , hi fanele hi endla matshalatshala lamakulu yo lwela ntshunxeko , xindzhuti , vululami ni ku antswisa timfanelo ta hina .\n", "2020-01-12 14:06:39,216 \tHypothesis: Ku kombisa swiendlo swa hina swo lwisana ni vutherorisi , hi fanele hi va ni matshalatshala lamakulu yo herisa ntshunxeko , vululami , vululami ni ya vanhu .\n", "2020-01-12 14:06:39,216 Example #3\n", "2020-01-12 14:06:39,216 \tSource: Baptism symbolizes dedication to God and marks one’s ordination as a minister\n", "2020-01-12 14:06:39,216 \tReference: Ku khuvuriwa ku kombisa ku tinyiketela eka Xikwembu naswona ku fungha ku vekiwa ka munhu tanihi mutirheli\n", "2020-01-12 14:06:39,216 \tHypothesis: Vuhundzuluxeri byi kombisa ku tinyiketela eka Xikwembu ni ku kombisa ku tinyiketela ka munhu tanihi mutirheli\n", "2020-01-12 14:06:39,217 Validation result (greedy) at epoch 3, step 19000: bleu: 27.01, loss: 45205.8320, ppl: 4.6218, duration: 37.5262s\n", "2020-01-12 14:06:53,549 Epoch 3 Step: 19100 Batch Loss: 1.640737 Tokens per Sec: 17407, Lr: 0.000300\n", "2020-01-12 14:07:07,927 Epoch 3 Step: 19200 Batch Loss: 1.689480 Tokens per Sec: 17277, Lr: 0.000300\n", "2020-01-12 14:07:22,385 Epoch 3 Step: 19300 Batch Loss: 1.798177 Tokens per Sec: 17710, Lr: 0.000300\n", "2020-01-12 14:07:36,718 Epoch 3 Step: 19400 Batch Loss: 1.663386 Tokens per Sec: 17536, Lr: 0.000300\n", "2020-01-12 14:07:51,175 Epoch 3 Step: 19500 Batch Loss: 1.809300 Tokens per Sec: 17520, Lr: 0.000300\n", "2020-01-12 14:08:05,595 Epoch 3 Step: 19600 Batch Loss: 1.504117 Tokens per Sec: 16924, Lr: 0.000300\n", "2020-01-12 14:08:19,977 Epoch 3 Step: 19700 Batch Loss: 1.670977 Tokens per Sec: 17515, Lr: 0.000300\n", "2020-01-12 14:08:34,261 Epoch 3 Step: 19800 Batch Loss: 1.478601 Tokens per Sec: 17229, Lr: 0.000300\n", "2020-01-12 14:08:48,637 Epoch 3 Step: 19900 Batch Loss: 1.918489 Tokens per Sec: 17547, Lr: 0.000300\n", "2020-01-12 14:09:02,921 Epoch 3 Step: 20000 Batch Loss: 1.708056 Tokens per Sec: 17200, Lr: 0.000300\n", "2020-01-12 14:09:40,115 Hooray! 
New best validation result [ppl]!\n", "2020-01-12 14:09:40,115 Saving new checkpoint.\n", "2020-01-12 14:09:40,470 Example #0\n", "2020-01-12 14:09:40,471 \tSource: But , naturally , we also want relief .\n", "2020-01-12 14:09:40,471 \tReference: Kambe hi lava leswaku ku xaniseka ku herisiwa .\n", "2020-01-12 14:09:40,471 \tHypothesis: Kambe , hi fanele hi va ni ntshembo .\n", "2020-01-12 14:09:40,471 Example #1\n", "2020-01-12 14:09:40,471 \tSource: 3 “ We’re Letting You Go ”\n", "2020-01-12 14:09:40,471 \tReference: 3 “ Ntirho Wa Wena Wu Herile ”\n", "2020-01-12 14:09:40,472 \tHypothesis: 3 “ Mi Fanele Mi Tlhela ”\n", "2020-01-12 14:09:40,472 Example #2\n", "2020-01-12 14:09:40,472 \tSource: Paralleling our actions against terrorism , we must have equally vigorous efforts to enhance freedom , dignity , justice , and humanitarian values .\n", "2020-01-12 14:09:40,472 \tReference: Loko hi ri karhi hi lwisana ni vutherorisi , hi fanele hi endla matshalatshala lamakulu yo lwela ntshunxeko , xindzhuti , vululami ni ku antswisa timfanelo ta hina .\n", "2020-01-12 14:09:40,472 \tHypothesis: Loko hi pfumelela swiendlo swa hina swo lwisana ni vutherorisi , hi fanele hi va ni matshalatshala lama hlamarisaka yo kuma ntshunxeko , vuhomboloki , vululami ni vululami .\n", "2020-01-12 14:09:40,472 Example #3\n", "2020-01-12 14:09:40,472 \tSource: Baptism symbolizes dedication to God and marks one’s ordination as a minister\n", "2020-01-12 14:09:40,473 \tReference: Ku khuvuriwa ku kombisa ku tinyiketela eka Xikwembu naswona ku fungha ku vekiwa ka munhu tanihi mutirheli\n", "2020-01-12 14:09:40,473 \tHypothesis: Vuprofeta byi kombisa ku tinyiketela eka Xikwembu ni ku kombisa ku va mutirheli wa munhu tanihi mutirheli\n", "2020-01-12 14:09:40,473 Validation result (greedy) at epoch 3, step 20000: bleu: 27.40, loss: 44536.9688, ppl: 4.5183, duration: 37.5513s\n", "2020-01-12 14:09:54,812 Epoch 3 Step: 20100 Batch Loss: 1.983732 Tokens per Sec: 17503, Lr: 0.000300\n", "2020-01-12 14:10:09,179 Epoch 3 Step: 20200 Batch Loss: 1.548101 Tokens per Sec: 17114, Lr: 0.000300\n", "2020-01-12 14:10:23,616 Epoch 3 Step: 20300 Batch Loss: 1.633597 Tokens per Sec: 17943, Lr: 0.000300\n", "2020-01-12 14:10:37,936 Epoch 3 Step: 20400 Batch Loss: 1.572507 Tokens per Sec: 17766, Lr: 0.000300\n", "2020-01-12 14:10:52,236 Epoch 3 Step: 20500 Batch Loss: 1.430011 Tokens per Sec: 17457, Lr: 0.000300\n", "2020-01-12 14:11:06,569 Epoch 3 Step: 20600 Batch Loss: 1.409915 Tokens per Sec: 17322, Lr: 0.000300\n", "2020-01-12 14:11:20,958 Epoch 3 Step: 20700 Batch Loss: 1.867363 Tokens per Sec: 17247, Lr: 0.000300\n", "2020-01-12 14:11:35,382 Epoch 3 Step: 20800 Batch Loss: 1.462609 Tokens per Sec: 17704, Lr: 0.000300\n", "2020-01-12 14:11:49,779 Epoch 3 Step: 20900 Batch Loss: 1.478946 Tokens per Sec: 17494, Lr: 0.000300\n", "2020-01-12 14:12:04,158 Epoch 3 Step: 21000 Batch Loss: 1.379816 Tokens per Sec: 17509, Lr: 0.000300\n", "2020-01-12 14:12:41,227 Hooray! 
New best validation result [ppl]!\n", "2020-01-12 14:12:41,227 Saving new checkpoint.\n", "2020-01-12 14:12:41,523 Example #0\n", "2020-01-12 14:12:41,523 \tSource: But , naturally , we also want relief .\n", "2020-01-12 14:12:41,523 \tReference: Kambe hi lava leswaku ku xaniseka ku herisiwa .\n", "2020-01-12 14:12:41,524 \tHypothesis: Kambe , hi ntolovelo hi lava ku ntshunxiwa .\n", "2020-01-12 14:12:41,524 Example #1\n", "2020-01-12 14:12:41,524 \tSource: 3 “ We’re Letting You Go ”\n", "2020-01-12 14:12:41,524 \tReference: 3 “ Ntirho Wa Wena Wu Herile ”\n", "2020-01-12 14:12:41,524 \tHypothesis: 3 “ Mi Fanele Mi Va Ni Ntirho ”\n", "2020-01-12 14:12:41,524 Example #2\n", "2020-01-12 14:12:41,524 \tSource: Paralleling our actions against terrorism , we must have equally vigorous efforts to enhance freedom , dignity , justice , and humanitarian values .\n", "2020-01-12 14:12:41,525 \tReference: Loko hi ri karhi hi lwisana ni vutherorisi , hi fanele hi endla matshalatshala lamakulu yo lwela ntshunxeko , xindzhuti , vululami ni ku antswisa timfanelo ta hina .\n", "2020-01-12 14:12:41,525 \tHypothesis: Loko hi tshika swiendlo swa hina swo lwisana ni ku tidlaya , hi fanele hi va ni matshalatshala lama hlamarisaka yo kuma ntshunxeko , ku nga vululami ni ku va ni mahanyelo ya vanhu .\n", "2020-01-12 14:12:41,525 Example #3\n", "2020-01-12 14:12:41,525 \tSource: Baptism symbolizes dedication to God and marks one’s ordination as a minister\n", "2020-01-12 14:12:41,525 \tReference: Ku khuvuriwa ku kombisa ku tinyiketela eka Xikwembu naswona ku fungha ku vekiwa ka munhu tanihi mutirheli\n", "2020-01-12 14:12:41,525 \tHypothesis: Ku tinyiketela eka Xikwembu ni ku kunguhata ku va mutirheli wa munhu hi laha ku heleleke\n", "2020-01-12 14:12:41,525 Validation result (greedy) at epoch 3, step 21000: bleu: 28.46, loss: 43765.8984, ppl: 4.4019, duration: 37.3670s\n", "2020-01-12 14:12:55,897 Epoch 3 Step: 21100 Batch Loss: 1.685351 Tokens per Sec: 17645, Lr: 0.000300\n", "2020-01-12 14:13:10,169 Epoch 3 Step: 21200 Batch Loss: 1.476081 Tokens per Sec: 17388, Lr: 0.000300\n", "2020-01-12 14:13:24,472 Epoch 3 Step: 21300 Batch Loss: 1.469918 Tokens per Sec: 17518, Lr: 0.000300\n", "2020-01-12 14:13:38,789 Epoch 3 Step: 21400 Batch Loss: 1.636765 Tokens per Sec: 17304, Lr: 0.000300\n", "2020-01-12 14:13:53,169 Epoch 3 Step: 21500 Batch Loss: 2.005800 Tokens per Sec: 17433, Lr: 0.000300\n", "2020-01-12 14:14:07,515 Epoch 3 Step: 21600 Batch Loss: 1.744313 Tokens per Sec: 17393, Lr: 0.000300\n", "2020-01-12 14:14:21,926 Epoch 3 Step: 21700 Batch Loss: 1.649488 Tokens per Sec: 17509, Lr: 0.000300\n", "2020-01-12 14:14:36,257 Epoch 3 Step: 21800 Batch Loss: 1.635584 Tokens per Sec: 17490, Lr: 0.000300\n", "2020-01-12 14:14:50,675 Epoch 3 Step: 21900 Batch Loss: 2.049327 Tokens per Sec: 17219, Lr: 0.000300\n", "2020-01-12 14:15:05,022 Epoch 3 Step: 22000 Batch Loss: 1.748790 Tokens per Sec: 17410, Lr: 0.000300\n", "2020-01-12 14:15:42,201 Hooray! 
New best validation result [ppl]!\n", "2020-01-12 14:15:42,202 Saving new checkpoint.\n", "2020-01-12 14:15:42,554 Example #0\n", "2020-01-12 14:15:42,555 \tSource: But , naturally , we also want relief .\n", "2020-01-12 14:15:42,555 \tReference: Kambe hi lava leswaku ku xaniseka ku herisiwa .\n", "2020-01-12 14:15:42,555 \tHypothesis: Kambe , hi ntumbuluko hi lava ku ntshunxiwa .\n", "2020-01-12 14:15:42,555 Example #1\n", "2020-01-12 14:15:42,555 \tSource: 3 “ We’re Letting You Go ”\n", "2020-01-12 14:15:42,555 \tReference: 3 “ Ntirho Wa Wena Wu Herile ”\n", "2020-01-12 14:15:42,555 \tHypothesis: 3 “ Mi Nga Ku Lavisisa ”\n", "2020-01-12 14:15:42,555 Example #2\n", "2020-01-12 14:15:42,555 \tSource: Paralleling our actions against terrorism , we must have equally vigorous efforts to enhance freedom , dignity , justice , and humanitarian values .\n", "2020-01-12 14:15:42,556 \tReference: Loko hi ri karhi hi lwisana ni vutherorisi , hi fanele hi endla matshalatshala lamakulu yo lwela ntshunxeko , xindzhuti , vululami ni ku antswisa timfanelo ta hina .\n", "2020-01-12 14:15:42,556 \tHypothesis: Loko hi tshika swiendlo swa hina swo lwisana ni vutherorisi , hi fanele hi va ni matshalatshala lama hlamarisaka yo nyikela ntshunxeko , vululami ni mahanyelo ya vanhu .\n", "2020-01-12 14:15:42,556 Example #3\n", "2020-01-12 14:15:42,556 \tSource: Baptism symbolizes dedication to God and marks one’s ordination as a minister\n", "2020-01-12 14:15:42,556 \tReference: Ku khuvuriwa ku kombisa ku tinyiketela eka Xikwembu naswona ku fungha ku vekiwa ka munhu tanihi mutirheli\n", "2020-01-12 14:15:42,556 \tHypothesis: Ku tinyiketela eka Xikwembu ni ku kombisa ku tinyiketela ka munhu tanihi mutirheli\n", "2020-01-12 14:15:42,556 Validation result (greedy) at epoch 3, step 22000: bleu: 28.56, loss: 43220.5078, ppl: 4.3213, duration: 37.5337s\n", "2020-01-12 14:15:57,082 Epoch 3 Step: 22100 Batch Loss: 1.539314 Tokens per Sec: 17363, Lr: 0.000300\n", "2020-01-12 14:16:11,470 Epoch 3 Step: 22200 Batch Loss: 1.426459 Tokens per Sec: 17602, Lr: 0.000300\n", "2020-01-12 14:16:25,854 Epoch 3 Step: 22300 Batch Loss: 1.607926 Tokens per Sec: 16959, Lr: 0.000300\n", "2020-01-12 14:16:40,254 Epoch 3 Step: 22400 Batch Loss: 1.563157 Tokens per Sec: 17555, Lr: 0.000300\n", "2020-01-12 14:16:54,671 Epoch 3 Step: 22500 Batch Loss: 1.514071 Tokens per Sec: 17421, Lr: 0.000300\n", "2020-01-12 14:17:09,001 Epoch 3 Step: 22600 Batch Loss: 1.620549 Tokens per Sec: 17656, Lr: 0.000300\n", "2020-01-12 14:17:23,403 Epoch 3 Step: 22700 Batch Loss: 1.529656 Tokens per Sec: 17297, Lr: 0.000300\n", "2020-01-12 14:17:37,768 Epoch 3 Step: 22800 Batch Loss: 1.576639 Tokens per Sec: 17896, Lr: 0.000300\n", "2020-01-12 14:17:52,135 Epoch 3 Step: 22900 Batch Loss: 1.859614 Tokens per Sec: 17322, Lr: 0.000300\n", "2020-01-12 14:18:06,629 Epoch 3 Step: 23000 Batch Loss: 1.670174 Tokens per Sec: 17576, Lr: 0.000300\n", "2020-01-12 14:18:43,801 Hooray! 
New best validation result [ppl]!\n", "2020-01-12 14:18:43,802 Saving new checkpoint.\n", "2020-01-12 14:18:44,186 Example #0\n", "2020-01-12 14:18:44,186 \tSource: But , naturally , we also want relief .\n", "2020-01-12 14:18:44,186 \tReference: Kambe hi lava leswaku ku xaniseka ku herisiwa .\n", "2020-01-12 14:18:44,186 \tHypothesis: Kambe , hi ku hatlisa , hi lava ku ntshunxiwa .\n", "2020-01-12 14:18:44,186 Example #1\n", "2020-01-12 14:18:44,186 \tSource: 3 “ We’re Letting You Go ”\n", "2020-01-12 14:18:44,187 \tReference: 3 “ Ntirho Wa Wena Wu Herile ”\n", "2020-01-12 14:18:44,187 \tHypothesis: 3 “ Mi Nga Ku Pfumela ”\n", "2020-01-12 14:18:44,187 Example #2\n", "2020-01-12 14:18:44,187 \tSource: Paralleling our actions against terrorism , we must have equally vigorous efforts to enhance freedom , dignity , justice , and humanitarian values .\n", "2020-01-12 14:18:44,187 \tReference: Loko hi ri karhi hi lwisana ni vutherorisi , hi fanele hi endla matshalatshala lamakulu yo lwela ntshunxeko , xindzhuti , vululami ni ku antswisa timfanelo ta hina .\n", "2020-01-12 14:18:44,187 \tHypothesis: Loko hi tshika swiendlo swa hina swo lwisana ni ku tidlaya , hi fanele hi va ni matshalatshala lamakulu yo kuma ntshunxeko , vululami ni mahanyelo ya vanhu .\n", "2020-01-12 14:18:44,187 Example #3\n", "2020-01-12 14:18:44,187 \tSource: Baptism symbolizes dedication to God and marks one’s ordination as a minister\n", "2020-01-12 14:18:44,188 \tReference: Ku khuvuriwa ku kombisa ku tinyiketela eka Xikwembu naswona ku fungha ku vekiwa ka munhu tanihi mutirheli\n", "2020-01-12 14:18:44,188 \tHypothesis: Vuyelo byi kombisa ku tinyiketela eka Xikwembu ni ku kombisa ku va mutirheli wa xona tanihi mutirheli\n", "2020-01-12 14:18:44,188 Validation result (greedy) at epoch 3, step 23000: bleu: 29.41, loss: 42786.5859, ppl: 4.2583, duration: 37.5583s\n", "2020-01-12 14:18:58,492 Epoch 3 Step: 23100 Batch Loss: 1.597456 Tokens per Sec: 17559, Lr: 0.000300\n", "2020-01-12 14:19:12,839 Epoch 3 Step: 23200 Batch Loss: 1.787043 Tokens per Sec: 17342, Lr: 0.000300\n", "2020-01-12 14:19:27,157 Epoch 3 Step: 23300 Batch Loss: 1.424187 Tokens per Sec: 17277, Lr: 0.000300\n", "2020-01-12 14:19:41,459 Epoch 3 Step: 23400 Batch Loss: 1.823428 Tokens per Sec: 17494, Lr: 0.000300\n", "2020-01-12 14:19:55,755 Epoch 3 Step: 23500 Batch Loss: 1.763729 Tokens per Sec: 17283, Lr: 0.000300\n", "2020-01-12 14:20:10,067 Epoch 3 Step: 23600 Batch Loss: 1.872282 Tokens per Sec: 17200, Lr: 0.000300\n", "2020-01-12 14:20:24,499 Epoch 3 Step: 23700 Batch Loss: 1.675861 Tokens per Sec: 17418, Lr: 0.000300\n", "2020-01-12 14:20:38,815 Epoch 3 Step: 23800 Batch Loss: 1.491661 Tokens per Sec: 17364, Lr: 0.000300\n", "2020-01-12 14:20:53,152 Epoch 3 Step: 23900 Batch Loss: 1.486850 Tokens per Sec: 17301, Lr: 0.000300\n", "2020-01-12 14:21:07,492 Epoch 3 Step: 24000 Batch Loss: 1.553474 Tokens per Sec: 17209, Lr: 0.000300\n", "2020-01-12 14:21:44,733 Hooray! 
New best validation result [ppl]!\n", "2020-01-12 14:21:44,733 Saving new checkpoint.\n", "2020-01-12 14:21:45,085 Example #0\n", "2020-01-12 14:21:45,086 \tSource: But , naturally , we also want relief .\n", "2020-01-12 14:21:45,086 \tReference: Kambe hi lava leswaku ku xaniseka ku herisiwa .\n", "2020-01-12 14:21:45,086 \tHypothesis: Kambe , hi ku hatlisa hi lava ku ntshunxiwa .\n", "2020-01-12 14:21:45,086 Example #1\n", "2020-01-12 14:21:45,086 \tSource: 3 “ We’re Letting You Go ”\n", "2020-01-12 14:21:45,086 \tReference: 3 “ Ntirho Wa Wena Wu Herile ”\n", "2020-01-12 14:21:45,086 \tHypothesis: 3 “ Hi Ku Lavisisa ”\n", "2020-01-12 14:21:45,087 Example #2\n", "2020-01-12 14:21:45,087 \tSource: Paralleling our actions against terrorism , we must have equally vigorous efforts to enhance freedom , dignity , justice , and humanitarian values .\n", "2020-01-12 14:21:45,087 \tReference: Loko hi ri karhi hi lwisana ni vutherorisi , hi fanele hi endla matshalatshala lamakulu yo lwela ntshunxeko , xindzhuti , vululami ni ku antswisa timfanelo ta hina .\n", "2020-01-12 14:21:45,087 \tHypothesis: Ku kombisa swiendlo swa hina swo lwisana ni ku tidlaya , hi fanele hi va ni matshalatshala lamakulu yo kuma ntshunxeko , ku karhateka , vululami ni mahanyelo ya vanhu .\n", "2020-01-12 14:21:45,087 Example #3\n", "2020-01-12 14:21:45,087 \tSource: Baptism symbolizes dedication to God and marks one’s ordination as a minister\n", "2020-01-12 14:21:45,087 \tReference: Ku khuvuriwa ku kombisa ku tinyiketela eka Xikwembu naswona ku fungha ku vekiwa ka munhu tanihi mutirheli\n", "2020-01-12 14:21:45,087 \tHypothesis: Vutirheli byi kombisa ku tinyiketela eka Xikwembu ni ku hlawula ku va mutirheli wa xona tanihi mutirheli\n", "2020-01-12 14:21:45,088 Validation result (greedy) at epoch 3, step 24000: bleu: 29.08, loss: 42366.4531, ppl: 4.1981, duration: 37.5956s\n", "2020-01-12 14:21:59,521 Epoch 3 Step: 24100 Batch Loss: 1.729798 Tokens per Sec: 17354, Lr: 0.000300\n", "2020-01-12 14:22:13,940 Epoch 3 Step: 24200 Batch Loss: 1.394941 Tokens per Sec: 17699, Lr: 0.000300\n", "2020-01-12 14:22:28,276 Epoch 3 Step: 24300 Batch Loss: 1.567867 Tokens per Sec: 17585, Lr: 0.000300\n", "2020-01-12 14:22:42,665 Epoch 3 Step: 24400 Batch Loss: 1.698862 Tokens per Sec: 17571, Lr: 0.000300\n", "2020-01-12 14:22:57,013 Epoch 3 Step: 24500 Batch Loss: 1.534858 Tokens per Sec: 17157, Lr: 0.000300\n", "2020-01-12 14:23:11,327 Epoch 3 Step: 24600 Batch Loss: 1.691472 Tokens per Sec: 17066, Lr: 0.000300\n", "2020-01-12 14:23:25,760 Epoch 3 Step: 24700 Batch Loss: 1.677472 Tokens per Sec: 17392, Lr: 0.000300\n", "2020-01-12 14:23:40,137 Epoch 3 Step: 24800 Batch Loss: 1.565820 Tokens per Sec: 17679, Lr: 0.000300\n", "2020-01-12 14:23:54,495 Epoch 3 Step: 24900 Batch Loss: 1.493876 Tokens per Sec: 17314, Lr: 0.000300\n", "2020-01-12 14:24:08,879 Epoch 3 Step: 25000 Batch Loss: 1.965742 Tokens per Sec: 17412, Lr: 0.000300\n", "2020-01-12 14:24:46,100 Hooray! 
New best validation result [ppl]!\n", "2020-01-12 14:24:46,101 Saving new checkpoint.\n", "2020-01-12 14:24:46,403 Example #0\n", "2020-01-12 14:24:46,403 \tSource: But , naturally , we also want relief .\n", "2020-01-12 14:24:46,403 \tReference: Kambe hi lava leswaku ku xaniseka ku herisiwa .\n", "2020-01-12 14:24:46,404 \tHypothesis: Kambe , hi ku olova , na hina hi lava ku ntshunxiwa .\n", "2020-01-12 14:24:46,404 Example #1\n", "2020-01-12 14:24:46,404 \tSource: 3 “ We’re Letting You Go ”\n", "2020-01-12 14:24:46,404 \tReference: 3 “ Ntirho Wa Wena Wu Herile ”\n", "2020-01-12 14:24:46,404 \tHypothesis: 3 “ Mi Nga Ku Pfuna ”\n", "2020-01-12 14:24:46,404 Example #2\n", "2020-01-12 14:24:46,404 \tSource: Paralleling our actions against terrorism , we must have equally vigorous efforts to enhance freedom , dignity , justice , and humanitarian values .\n", "2020-01-12 14:24:46,404 \tReference: Loko hi ri karhi hi lwisana ni vutherorisi , hi fanele hi endla matshalatshala lamakulu yo lwela ntshunxeko , xindzhuti , vululami ni ku antswisa timfanelo ta hina .\n", "2020-01-12 14:24:46,404 \tHypothesis: Loko hi endla swiendlo swa hina swo lwisana ni vutherorisi , hi fanele hi va ni matshalatshala lamakulu yo herisa ntshunxeko , vululami ni mimpimanyeto ya vanhu .\n", "2020-01-12 14:24:46,404 Example #3\n", "2020-01-12 14:24:46,405 \tSource: Baptism symbolizes dedication to God and marks one’s ordination as a minister\n", "2020-01-12 14:24:46,405 \tReference: Ku khuvuriwa ku kombisa ku tinyiketela eka Xikwembu naswona ku fungha ku vekiwa ka munhu tanihi mutirheli\n", "2020-01-12 14:24:46,405 \tHypothesis: Vutirheli byi fanekisela ku tinyiketela eka Xikwembu ni ku teka xiboho xa munhu tanihi mutirheli\n", "2020-01-12 14:24:46,405 Validation result (greedy) at epoch 3, step 25000: bleu: 29.84, loss: 41729.5820, ppl: 4.1086, duration: 37.5254s\n", "2020-01-12 14:25:00,781 Epoch 3 Step: 25100 Batch Loss: 1.565694 Tokens per Sec: 17385, Lr: 0.000300\n", "2020-01-12 14:25:15,079 Epoch 3 Step: 25200 Batch Loss: 1.311025 Tokens per Sec: 17533, Lr: 0.000300\n", "2020-01-12 14:25:29,508 Epoch 3 Step: 25300 Batch Loss: 1.522955 Tokens per Sec: 17645, Lr: 0.000300\n", "2020-01-12 14:25:43,939 Epoch 3 Step: 25400 Batch Loss: 1.723812 Tokens per Sec: 17734, Lr: 0.000300\n", "2020-01-12 14:25:58,312 Epoch 3 Step: 25500 Batch Loss: 1.443171 Tokens per Sec: 17117, Lr: 0.000300\n", "2020-01-12 14:26:12,745 Epoch 3 Step: 25600 Batch Loss: 1.468349 Tokens per Sec: 17360, Lr: 0.000300\n", "2020-01-12 14:26:27,052 Epoch 3 Step: 25700 Batch Loss: 1.530365 Tokens per Sec: 17259, Lr: 0.000300\n", "2020-01-12 14:26:41,341 Epoch 3 Step: 25800 Batch Loss: 1.418025 Tokens per Sec: 16808, Lr: 0.000300\n", "2020-01-12 14:26:55,745 Epoch 3 Step: 25900 Batch Loss: 1.815760 Tokens per Sec: 17317, Lr: 0.000300\n", "2020-01-12 14:27:10,093 Epoch 3 Step: 26000 Batch Loss: 1.854960 Tokens per Sec: 17368, Lr: 0.000300\n", "2020-01-12 14:27:47,236 Hooray! 
New best validation result [ppl]!\n", "2020-01-12 14:27:47,236 Saving new checkpoint.\n", "2020-01-12 14:27:47,538 Example #0\n", "2020-01-12 14:27:47,538 \tSource: But , naturally , we also want relief .\n", "2020-01-12 14:27:47,539 \tReference: Kambe hi lava leswaku ku xaniseka ku herisiwa .\n", "2020-01-12 14:27:47,539 \tHypothesis: Kambe , hi ntolovelo hi lava ku ntshunxiwa .\n", "2020-01-12 14:27:47,539 Example #1\n", "2020-01-12 14:27:47,539 \tSource: 3 “ We’re Letting You Go ”\n", "2020-01-12 14:27:47,539 \tReference: 3 “ Ntirho Wa Wena Wu Herile ”\n", "2020-01-12 14:27:47,539 \tHypothesis: 3 “ Mi Nga Ku Pfumela ”\n", "2020-01-12 14:27:47,539 Example #2\n", "2020-01-12 14:27:47,539 \tSource: Paralleling our actions against terrorism , we must have equally vigorous efforts to enhance freedom , dignity , justice , and humanitarian values .\n", "2020-01-12 14:27:47,539 \tReference: Loko hi ri karhi hi lwisana ni vutherorisi , hi fanele hi endla matshalatshala lamakulu yo lwela ntshunxeko , xindzhuti , vululami ni ku antswisa timfanelo ta hina .\n", "2020-01-12 14:27:47,539 \tHypothesis: Loko hi tshika swiendlo swa hina swo lwisana ni ku tidlaya , hi fanele hi va ni matshalatshala lamakulu yo kuma ntshunxeko , ku va ni vululami ni mimpimanyeto ya vanhu .\n", "2020-01-12 14:27:47,540 Example #3\n", "2020-01-12 14:27:47,540 \tSource: Baptism symbolizes dedication to God and marks one’s ordination as a minister\n", "2020-01-12 14:27:47,540 \tReference: Ku khuvuriwa ku kombisa ku tinyiketela eka Xikwembu naswona ku fungha ku vekiwa ka munhu tanihi mutirheli\n", "2020-01-12 14:27:47,540 \tHypothesis: Baptism u kombisa ku tinyiketela eka Xikwembu naswona u teka ku ri mutirheli\n", "2020-01-12 14:27:47,540 Validation result (greedy) at epoch 3, step 26000: bleu: 29.99, loss: 41416.1367, ppl: 4.0652, duration: 37.4468s\n", "2020-01-12 14:28:01,895 Epoch 3 Step: 26100 Batch Loss: 1.515244 Tokens per Sec: 16910, Lr: 0.000300\n", "2020-01-12 14:28:16,273 Epoch 3 Step: 26200 Batch Loss: 1.491218 Tokens per Sec: 17527, Lr: 0.000300\n", "2020-01-12 14:28:30,762 Epoch 3 Step: 26300 Batch Loss: 1.498540 Tokens per Sec: 17787, Lr: 0.000300\n", "2020-01-12 14:28:45,214 Epoch 3 Step: 26400 Batch Loss: 1.653382 Tokens per Sec: 17487, Lr: 0.000300\n", "2020-01-12 14:28:59,566 Epoch 3 Step: 26500 Batch Loss: 1.551272 Tokens per Sec: 17366, Lr: 0.000300\n", "2020-01-12 14:29:13,931 Epoch 3 Step: 26600 Batch Loss: 1.293289 Tokens per Sec: 17568, Lr: 0.000300\n", "2020-01-12 14:29:28,298 Epoch 3 Step: 26700 Batch Loss: 1.582643 Tokens per Sec: 17492, Lr: 0.000300\n", "2020-01-12 14:29:42,670 Epoch 3 Step: 26800 Batch Loss: 1.595371 Tokens per Sec: 17300, Lr: 0.000300\n", "2020-01-12 14:29:56,994 Epoch 3 Step: 26900 Batch Loss: 1.397533 Tokens per Sec: 17413, Lr: 0.000300\n", "2020-01-12 14:29:57,319 Epoch 3: total training loss 14727.24\n", "2020-01-12 14:29:57,319 EPOCH 4\n", "2020-01-12 14:30:12,550 Epoch 4 Step: 27000 Batch Loss: 1.448565 Tokens per Sec: 15792, Lr: 0.000300\n", "2020-01-12 14:30:49,706 Hooray! 
New best validation result [ppl]!\n", "2020-01-12 14:30:49,707 Saving new checkpoint.\n", "2020-01-12 14:30:50,000 Example #0\n", "2020-01-12 14:30:50,000 \tSource: But , naturally , we also want relief .\n", "2020-01-12 14:30:50,000 \tReference: Kambe hi lava leswaku ku xaniseka ku herisiwa .\n", "2020-01-12 14:30:50,000 \tHypothesis: Kambe , hi ntolovelo hi lava ku ntshunxiwa .\n", "2020-01-12 14:30:50,000 Example #1\n", "2020-01-12 14:30:50,001 \tSource: 3 “ We’re Letting You Go ”\n", "2020-01-12 14:30:50,001 \tReference: 3 “ Ntirho Wa Wena Wu Herile ”\n", "2020-01-12 14:30:50,001 \tHypothesis: 3 “ Hi Fanele Hi Ku Kambisisa ”\n", "2020-01-12 14:30:50,001 Example #2\n", "2020-01-12 14:30:50,001 \tSource: Paralleling our actions against terrorism , we must have equally vigorous efforts to enhance freedom , dignity , justice , and humanitarian values .\n", "2020-01-12 14:30:50,001 \tReference: Loko hi ri karhi hi lwisana ni vutherorisi , hi fanele hi endla matshalatshala lamakulu yo lwela ntshunxeko , xindzhuti , vululami ni ku antswisa timfanelo ta hina .\n", "2020-01-12 14:30:50,001 \tHypothesis: Loko hi tshika swiendlo swa hina swo lwisana ni vutherorisi , hi fanele hi tikarhatela ku kuma ntshunxeko , vululami , vululami ni mimpimanyeto ya vanhu .\n", "2020-01-12 14:30:50,001 Example #3\n", "2020-01-12 14:30:50,002 \tSource: Baptism symbolizes dedication to God and marks one’s ordination as a minister\n", "2020-01-12 14:30:50,002 \tReference: Ku khuvuriwa ku kombisa ku tinyiketela eka Xikwembu naswona ku fungha ku vekiwa ka munhu tanihi mutirheli\n", "2020-01-12 14:30:50,002 \tHypothesis: Vuyelo byi fanekisela ku tinyiketela eka Xikwembu ni ku fungha ka munhu tanihi mutirheli\n", "2020-01-12 14:30:50,002 Validation result (greedy) at epoch 4, step 27000: bleu: 30.07, loss: 41221.9180, ppl: 4.0386, duration: 37.4512s\n", "2020-01-12 14:31:04,347 Epoch 4 Step: 27100 Batch Loss: 1.495077 Tokens per Sec: 17619, Lr: 0.000300\n", "2020-01-12 14:31:18,694 Epoch 4 Step: 27200 Batch Loss: 1.610404 Tokens per Sec: 17223, Lr: 0.000300\n", "2020-01-12 14:31:33,019 Epoch 4 Step: 27300 Batch Loss: 1.500612 Tokens per Sec: 17362, Lr: 0.000300\n", "2020-01-12 14:31:47,506 Epoch 4 Step: 27400 Batch Loss: 1.451622 Tokens per Sec: 17521, Lr: 0.000300\n", "2020-01-12 14:32:01,771 Epoch 4 Step: 27500 Batch Loss: 1.529230 Tokens per Sec: 17266, Lr: 0.000300\n", "2020-01-12 14:32:16,291 Epoch 4 Step: 27600 Batch Loss: 1.389819 Tokens per Sec: 17673, Lr: 0.000300\n", "2020-01-12 14:32:30,591 Epoch 4 Step: 27700 Batch Loss: 1.554393 Tokens per Sec: 17455, Lr: 0.000300\n", "2020-01-12 14:32:44,904 Epoch 4 Step: 27800 Batch Loss: 1.540393 Tokens per Sec: 17324, Lr: 0.000300\n", "2020-01-12 14:32:59,227 Epoch 4 Step: 27900 Batch Loss: 1.459493 Tokens per Sec: 17457, Lr: 0.000300\n", "2020-01-12 14:33:13,608 Epoch 4 Step: 28000 Batch Loss: 1.643711 Tokens per Sec: 17621, Lr: 0.000300\n", "2020-01-12 14:33:50,951 Example #0\n", "2020-01-12 14:33:50,951 \tSource: But , naturally , we also want relief .\n", "2020-01-12 14:33:50,951 \tReference: Kambe hi lava leswaku ku xaniseka ku herisiwa .\n", "2020-01-12 14:33:50,952 \tHypothesis: Kambe hi ku hiseka , hi lava ku ntshunxiwa .\n", "2020-01-12 14:33:50,952 Example #1\n", "2020-01-12 14:33:50,952 \tSource: 3 “ We’re Letting You Go ”\n", "2020-01-12 14:33:50,952 \tReference: 3 “ Ntirho Wa Wena Wu Herile ”\n", "2020-01-12 14:33:50,952 \tHypothesis: 3 “ Hi Ku Langutana Na Wena ”\n", "2020-01-12 14:33:50,952 Example #2\n", "2020-01-12 14:33:50,952 \tSource: Paralleling our 
actions against terrorism , we must have equally vigorous efforts to enhance freedom , dignity , justice , and humanitarian values .\n", "2020-01-12 14:33:50,953 \tReference: Loko hi ri karhi hi lwisana ni vutherorisi , hi fanele hi endla matshalatshala lamakulu yo lwela ntshunxeko , xindzhuti , vululami ni ku antswisa timfanelo ta hina .\n", "2020-01-12 14:33:50,953 \tHypothesis: Ku tshika swiendlo swa hina swo lwisana ni ku chava , hi fanele hi va ni matshalatshala yo herisa ntshunxeko , vuhomboloki ni ya vanhu .\n", "2020-01-12 14:33:50,953 Example #3\n", "2020-01-12 14:33:50,953 \tSource: Baptism symbolizes dedication to God and marks one’s ordination as a minister\n", "2020-01-12 14:33:50,953 \tReference: Ku khuvuriwa ku kombisa ku tinyiketela eka Xikwembu naswona ku fungha ku vekiwa ka munhu tanihi mutirheli\n", "2020-01-12 14:33:50,953 \tHypothesis: Vuhundzuluxeri byi kombisa ku tinyiketela eka Xikwembu ni ku teka ku va mutirheli\n", "2020-01-12 14:33:50,953 Validation result (greedy) at epoch 4, step 28000: bleu: 29.75, loss: 41374.1406, ppl: 4.0594, duration: 37.3453s\n", "2020-01-12 14:34:05,238 Epoch 4 Step: 28100 Batch Loss: 1.434325 Tokens per Sec: 17120, Lr: 0.000300\n", "2020-01-12 14:34:19,558 Epoch 4 Step: 28200 Batch Loss: 1.403063 Tokens per Sec: 17301, Lr: 0.000300\n", "2020-01-12 14:34:33,940 Epoch 4 Step: 28300 Batch Loss: 1.365813 Tokens per Sec: 17513, Lr: 0.000300\n", "2020-01-12 14:34:48,347 Epoch 4 Step: 28400 Batch Loss: 1.622642 Tokens per Sec: 17727, Lr: 0.000300\n", "2020-01-12 14:35:02,622 Epoch 4 Step: 28500 Batch Loss: 1.524755 Tokens per Sec: 17230, Lr: 0.000300\n", "2020-01-12 14:35:16,977 Epoch 4 Step: 28600 Batch Loss: 1.492057 Tokens per Sec: 17849, Lr: 0.000300\n", "2020-01-12 14:35:31,226 Epoch 4 Step: 28700 Batch Loss: 1.370841 Tokens per Sec: 17125, Lr: 0.000300\n", "2020-01-12 14:35:45,660 Epoch 4 Step: 28800 Batch Loss: 1.466228 Tokens per Sec: 17267, Lr: 0.000300\n", "2020-01-12 14:35:59,971 Epoch 4 Step: 28900 Batch Loss: 1.564829 Tokens per Sec: 17696, Lr: 0.000300\n", "2020-01-12 14:36:14,334 Epoch 4 Step: 29000 Batch Loss: 1.749195 Tokens per Sec: 17716, Lr: 0.000300\n", "2020-01-12 14:36:51,553 Hooray! 
New best validation result [ppl]!\n", "2020-01-12 14:36:51,553 Saving new checkpoint.\n", "2020-01-12 14:36:51,832 Example #0\n", "2020-01-12 14:36:51,832 \tSource: But , naturally , we also want relief .\n", "2020-01-12 14:36:51,832 \tReference: Kambe hi lava leswaku ku xaniseka ku herisiwa .\n", "2020-01-12 14:36:51,832 \tHypothesis: Kambe , hi ntolovelo hi lava ku ntshunxiwa .\n", "2020-01-12 14:36:51,832 Example #1\n", "2020-01-12 14:36:51,833 \tSource: 3 “ We’re Letting You Go ”\n", "2020-01-12 14:36:51,833 \tReference: 3 “ Ntirho Wa Wena Wu Herile ”\n", "2020-01-12 14:36:51,833 \tHypothesis: 3 “ Hi Ta Ku Langutana Na Wena ”\n", "2020-01-12 14:36:51,833 Example #2\n", "2020-01-12 14:36:51,833 \tSource: Paralleling our actions against terrorism , we must have equally vigorous efforts to enhance freedom , dignity , justice , and humanitarian values .\n", "2020-01-12 14:36:51,833 \tReference: Loko hi ri karhi hi lwisana ni vutherorisi , hi fanele hi endla matshalatshala lamakulu yo lwela ntshunxeko , xindzhuti , vululami ni ku antswisa timfanelo ta hina .\n", "2020-01-12 14:36:51,833 \tHypothesis: Loko hi tshika swiendlo swa hina swo biha , hi fanele hi va ni matshalatshala lama nga ni vukheta yo kuma ntshunxeko , vuhomboloki , vululami ni mimpimanyeto ya vanhu .\n", "2020-01-12 14:36:51,833 Example #3\n", "2020-01-12 14:36:51,833 \tSource: Baptism symbolizes dedication to God and marks one’s ordination as a minister\n", "2020-01-12 14:36:51,833 \tReference: Ku khuvuriwa ku kombisa ku tinyiketela eka Xikwembu naswona ku fungha ku vekiwa ka munhu tanihi mutirheli\n", "2020-01-12 14:36:51,834 \tHypothesis: Vutirheli byi fanekisela ku tinyiketela eka Xikwembu ni ku teka xiphemu xa munhu tanihi mutirheli\n", "2020-01-12 14:36:51,834 Validation result (greedy) at epoch 4, step 29000: bleu: 30.58, loss: 40627.2227, ppl: 3.9580, duration: 37.4995s\n", "2020-01-12 14:37:06,062 Epoch 4 Step: 29100 Batch Loss: 1.771435 Tokens per Sec: 16954, Lr: 0.000300\n", "2020-01-12 14:37:20,375 Epoch 4 Step: 29200 Batch Loss: 1.410100 Tokens per Sec: 17565, Lr: 0.000300\n", "2020-01-12 14:37:34,619 Epoch 4 Step: 29300 Batch Loss: 1.568475 Tokens per Sec: 17140, Lr: 0.000300\n", "2020-01-12 14:37:48,966 Epoch 4 Step: 29400 Batch Loss: 1.535724 Tokens per Sec: 17462, Lr: 0.000300\n", "2020-01-12 14:38:03,343 Epoch 4 Step: 29500 Batch Loss: 1.404746 Tokens per Sec: 17458, Lr: 0.000300\n", "2020-01-12 14:38:17,583 Epoch 4 Step: 29600 Batch Loss: 1.557012 Tokens per Sec: 17307, Lr: 0.000300\n", "2020-01-12 14:38:31,880 Epoch 4 Step: 29700 Batch Loss: 1.503369 Tokens per Sec: 17266, Lr: 0.000300\n", "2020-01-12 14:38:46,389 Epoch 4 Step: 29800 Batch Loss: 1.600176 Tokens per Sec: 17812, Lr: 0.000300\n", "2020-01-12 14:39:00,732 Epoch 4 Step: 29900 Batch Loss: 1.665188 Tokens per Sec: 17231, Lr: 0.000300\n", "2020-01-12 14:39:15,085 Epoch 4 Step: 30000 Batch Loss: 1.573647 Tokens per Sec: 17760, Lr: 0.000300\n", "2020-01-12 14:39:52,333 Hooray! 
New best validation result [ppl]!\n", "2020-01-12 14:39:52,333 Saving new checkpoint.\n", "2020-01-12 14:39:52,633 Example #0\n", "2020-01-12 14:39:52,634 \tSource: But , naturally , we also want relief .\n", "2020-01-12 14:39:52,634 \tReference: Kambe hi lava leswaku ku xaniseka ku herisiwa .\n", "2020-01-12 14:39:52,634 \tHypothesis: Kambe , hi ntumbuluko hi lava ku ntshunxiwa .\n", "2020-01-12 14:39:52,634 Example #1\n", "2020-01-12 14:39:52,634 \tSource: 3 “ We’re Letting You Go ”\n", "2020-01-12 14:39:52,634 \tReference: 3 “ Ntirho Wa Wena Wu Herile ”\n", "2020-01-12 14:39:52,634 \tHypothesis: 3 “ Mi Tirhisa ”\n", "2020-01-12 14:39:52,634 Example #2\n", "2020-01-12 14:39:52,634 \tSource: Paralleling our actions against terrorism , we must have equally vigorous efforts to enhance freedom , dignity , justice , and humanitarian values .\n", "2020-01-12 14:39:52,634 \tReference: Loko hi ri karhi hi lwisana ni vutherorisi , hi fanele hi endla matshalatshala lamakulu yo lwela ntshunxeko , xindzhuti , vululami ni ku antswisa timfanelo ta hina .\n", "2020-01-12 14:39:52,635 \tHypothesis: Loko hi tshika swiendlo swa hina swo lwisana ni ku tidlaya , hi fanele hi va ni matshalatshala lamakulu yo kuma ntshunxeko , vululami , vululami ni mimpimanyeto ya vanhu .\n", "2020-01-12 14:39:52,635 Example #3\n", "2020-01-12 14:39:52,635 \tSource: Baptism symbolizes dedication to God and marks one’s ordination as a minister\n", "2020-01-12 14:39:52,635 \tReference: Ku khuvuriwa ku kombisa ku tinyiketela eka Xikwembu naswona ku fungha ku vekiwa ka munhu tanihi mutirheli\n", "2020-01-12 14:39:52,635 \tHypothesis: Nkhuvulo wu kombisa ku tinyiketela eka Xikwembu ni ku fungha munhu tanihi mutirheli\n", "2020-01-12 14:39:52,635 Validation result (greedy) at epoch 4, step 30000: bleu: 30.57, loss: 40022.1523, ppl: 3.8778, duration: 37.5496s\n", "2020-01-12 14:40:06,971 Epoch 4 Step: 30100 Batch Loss: 1.363161 Tokens per Sec: 17269, Lr: 0.000300\n", "2020-01-12 14:40:21,316 Epoch 4 Step: 30200 Batch Loss: 1.841329 Tokens per Sec: 17571, Lr: 0.000300\n", "2020-01-12 14:40:35,650 Epoch 4 Step: 30300 Batch Loss: 1.770253 Tokens per Sec: 17522, Lr: 0.000300\n", "2020-01-12 14:40:49,993 Epoch 4 Step: 30400 Batch Loss: 1.976511 Tokens per Sec: 17643, Lr: 0.000300\n", "2020-01-12 14:41:04,353 Epoch 4 Step: 30500 Batch Loss: 1.440428 Tokens per Sec: 17732, Lr: 0.000300\n", "2020-01-12 14:41:18,615 Epoch 4 Step: 30600 Batch Loss: 1.504606 Tokens per Sec: 17290, Lr: 0.000300\n", "2020-01-12 14:41:32,964 Epoch 4 Step: 30700 Batch Loss: 1.593658 Tokens per Sec: 17346, Lr: 0.000300\n", "2020-01-12 14:41:47,340 Epoch 4 Step: 30800 Batch Loss: 1.667952 Tokens per Sec: 17565, Lr: 0.000300\n", "2020-01-12 14:42:01,753 Epoch 4 Step: 30900 Batch Loss: 1.440042 Tokens per Sec: 17765, Lr: 0.000300\n", "2020-01-12 14:42:16,080 Epoch 4 Step: 31000 Batch Loss: 1.367240 Tokens per Sec: 17747, Lr: 0.000300\n", "2020-01-12 14:42:53,206 Hooray! 
New best validation result [ppl]!\n", "2020-01-12 14:42:53,206 Saving new checkpoint.\n", "2020-01-12 14:42:53,529 Example #0\n", "2020-01-12 14:42:53,529 \tSource: But , naturally , we also want relief .\n", "2020-01-12 14:42:53,529 \tReference: Kambe hi lava leswaku ku xaniseka ku herisiwa .\n", "2020-01-12 14:42:53,529 \tHypothesis: Kambe , hi ntolovelo , hi lava ku ntshunxiwa .\n", "2020-01-12 14:42:53,529 Example #1\n", "2020-01-12 14:42:53,530 \tSource: 3 “ We’re Letting You Go ”\n", "2020-01-12 14:42:53,530 \tReference: 3 “ Ntirho Wa Wena Wu Herile ”\n", "2020-01-12 14:42:53,530 \tHypothesis: 3 “ Hi Tiyisela Ku Endla Leswaku U Va Ni Ntirho ”\n", "2020-01-12 14:42:53,530 Example #2\n", "2020-01-12 14:42:53,530 \tSource: Paralleling our actions against terrorism , we must have equally vigorous efforts to enhance freedom , dignity , justice , and humanitarian values .\n", "2020-01-12 14:42:53,530 \tReference: Loko hi ri karhi hi lwisana ni vutherorisi , hi fanele hi endla matshalatshala lamakulu yo lwela ntshunxeko , xindzhuti , vululami ni ku antswisa timfanelo ta hina .\n", "2020-01-12 14:42:53,531 \tHypothesis: Ku tshika swiendlo swa hina swo lwisana ni vutherorisi , hi fanele hi va ni matshalatshala lama nga ni vukanganyisi yo herisa ntshunxeko , vululami , vululami ni mahanyelo ya vanhu .\n", "2020-01-12 14:42:53,531 Example #3\n", "2020-01-12 14:42:53,531 \tSource: Baptism symbolizes dedication to God and marks one’s ordination as a minister\n", "2020-01-12 14:42:53,531 \tReference: Ku khuvuriwa ku kombisa ku tinyiketela eka Xikwembu naswona ku fungha ku vekiwa ka munhu tanihi mutirheli\n", "2020-01-12 14:42:53,531 \tHypothesis: Nkhuvulo wu fanekisela ku tinyiketela eka Xikwembu ni ku fungha ka munhu tanihi mutirheli\n", "2020-01-12 14:42:53,531 Validation result (greedy) at epoch 4, step 31000: bleu: 31.36, loss: 39652.4688, ppl: 3.8295, duration: 37.4508s\n", "2020-01-12 14:43:07,900 Epoch 4 Step: 31100 Batch Loss: 1.509464 Tokens per Sec: 17621, Lr: 0.000300\n", "2020-01-12 14:43:22,211 Epoch 4 Step: 31200 Batch Loss: 1.381702 Tokens per Sec: 17374, Lr: 0.000300\n", "2020-01-12 14:43:36,485 Epoch 4 Step: 31300 Batch Loss: 1.694663 Tokens per Sec: 17823, Lr: 0.000300\n", "2020-01-12 14:43:50,814 Epoch 4 Step: 31400 Batch Loss: 1.326799 Tokens per Sec: 17324, Lr: 0.000300\n", "2020-01-12 14:44:05,238 Epoch 4 Step: 31500 Batch Loss: 1.513231 Tokens per Sec: 17673, Lr: 0.000300\n", "2020-01-12 14:44:19,491 Epoch 4 Step: 31600 Batch Loss: 1.589839 Tokens per Sec: 17691, Lr: 0.000300\n", "2020-01-12 14:44:33,808 Epoch 4 Step: 31700 Batch Loss: 1.593611 Tokens per Sec: 17429, Lr: 0.000300\n", "2020-01-12 14:44:48,057 Epoch 4 Step: 31800 Batch Loss: 1.510215 Tokens per Sec: 17688, Lr: 0.000300\n", "2020-01-12 14:45:02,324 Epoch 4 Step: 31900 Batch Loss: 1.238107 Tokens per Sec: 17157, Lr: 0.000300\n", "2020-01-12 14:45:16,635 Epoch 4 Step: 32000 Batch Loss: 1.587565 Tokens per Sec: 17589, Lr: 0.000300\n", "2020-01-12 14:45:53,758 Hooray! 
New best validation result [ppl]!\n", "2020-01-12 14:45:53,758 Saving new checkpoint.\n", "2020-01-12 14:45:54,053 Example #0\n", "2020-01-12 14:45:54,053 \tSource: But , naturally , we also want relief .\n", "2020-01-12 14:45:54,053 \tReference: Kambe hi lava leswaku ku xaniseka ku herisiwa .\n", "2020-01-12 14:45:54,053 \tHypothesis: Kambe , hi ntolovelo , hi lava ku ntshunxiwa .\n", "2020-01-12 14:45:54,053 Example #1\n", "2020-01-12 14:45:54,054 \tSource: 3 “ We’re Letting You Go ”\n", "2020-01-12 14:45:54,054 \tReference: 3 “ Ntirho Wa Wena Wu Herile ”\n", "2020-01-12 14:45:54,054 \tHypothesis: 3 “ Hi Lwela Ku Ku Kombisa ”\n", "2020-01-12 14:45:54,054 Example #2\n", "2020-01-12 14:45:54,054 \tSource: Paralleling our actions against terrorism , we must have equally vigorous efforts to enhance freedom , dignity , justice , and humanitarian values .\n", "2020-01-12 14:45:54,054 \tReference: Loko hi ri karhi hi lwisana ni vutherorisi , hi fanele hi endla matshalatshala lamakulu yo lwela ntshunxeko , xindzhuti , vululami ni ku antswisa timfanelo ta hina .\n", "2020-01-12 14:45:54,054 \tHypothesis: Loko hi tshika swiendlo swa hina swo lwisana ni ku tidlaya , hi fanele hi va ni matshalatshala lama hlamarisaka yo endla ntshunxeko , vululami , vululami ni mahanyelo ya vanhu .\n", "2020-01-12 14:45:54,054 Example #3\n", "2020-01-12 14:45:54,054 \tSource: Baptism symbolizes dedication to God and marks one’s ordination as a minister\n", "2020-01-12 14:45:54,055 \tReference: Ku khuvuriwa ku kombisa ku tinyiketela eka Xikwembu naswona ku fungha ku vekiwa ka munhu tanihi mutirheli\n", "2020-01-12 14:45:54,055 \tHypothesis: Nkhuvulo wu kombisa ku tinyiketela eka Xikwembu ni ku fungha ku va mutirheli\n", "2020-01-12 14:45:54,055 Validation result (greedy) at epoch 4, step 32000: bleu: 31.38, loss: 39589.6055, ppl: 3.8214, duration: 37.4193s\n", "2020-01-12 14:46:08,390 Epoch 4 Step: 32100 Batch Loss: 1.414032 Tokens per Sec: 17284, Lr: 0.000300\n", "2020-01-12 14:46:22,659 Epoch 4 Step: 32200 Batch Loss: 1.437150 Tokens per Sec: 17440, Lr: 0.000300\n", "2020-01-12 14:46:37,002 Epoch 4 Step: 32300 Batch Loss: 1.454321 Tokens per Sec: 17352, Lr: 0.000300\n", "2020-01-12 14:46:51,303 Epoch 4 Step: 32400 Batch Loss: 1.376225 Tokens per Sec: 17662, Lr: 0.000300\n", "2020-01-12 14:47:05,600 Epoch 4 Step: 32500 Batch Loss: 1.418253 Tokens per Sec: 17706, Lr: 0.000300\n", "2020-01-12 14:47:19,927 Epoch 4 Step: 32600 Batch Loss: 1.506431 Tokens per Sec: 17677, Lr: 0.000300\n", "2020-01-12 14:47:34,171 Epoch 4 Step: 32700 Batch Loss: 1.557303 Tokens per Sec: 17564, Lr: 0.000300\n", "2020-01-12 14:47:48,496 Epoch 4 Step: 32800 Batch Loss: 1.508148 Tokens per Sec: 17305, Lr: 0.000300\n", "2020-01-12 14:48:02,799 Epoch 4 Step: 32900 Batch Loss: 1.430270 Tokens per Sec: 17495, Lr: 0.000300\n", "2020-01-12 14:48:17,098 Epoch 4 Step: 33000 Batch Loss: 1.497375 Tokens per Sec: 17341, Lr: 0.000300\n", "2020-01-12 14:48:54,167 Hooray! 
New best validation result [ppl]!\n", "2020-01-12 14:48:54,167 Saving new checkpoint.\n", "2020-01-12 14:48:54,473 Example #0\n", "2020-01-12 14:48:54,473 \tSource: But , naturally , we also want relief .\n", "2020-01-12 14:48:54,473 \tReference: Kambe hi lava leswaku ku xaniseka ku herisiwa .\n", "2020-01-12 14:48:54,473 \tHypothesis: Kambe , hi lava ku ntshunxiwa .\n", "2020-01-12 14:48:54,473 Example #1\n", "2020-01-12 14:48:54,473 \tSource: 3 “ We’re Letting You Go ”\n", "2020-01-12 14:48:54,474 \tReference: 3 “ Ntirho Wa Wena Wu Herile ”\n", "2020-01-12 14:48:54,474 \tHypothesis: 3 “ Hi Fanele Hi Ku Kombisa ”\n", "2020-01-12 14:48:54,474 Example #2\n", "2020-01-12 14:48:54,474 \tSource: Paralleling our actions against terrorism , we must have equally vigorous efforts to enhance freedom , dignity , justice , and humanitarian values .\n", "2020-01-12 14:48:54,474 \tReference: Loko hi ri karhi hi lwisana ni vutherorisi , hi fanele hi endla matshalatshala lamakulu yo lwela ntshunxeko , xindzhuti , vululami ni ku antswisa timfanelo ta hina .\n", "2020-01-12 14:48:54,474 \tHypothesis: Loko hi tshika swiendlo swa hina swo lwisana ni ku tidlaya , hi fanele hi va ni matshalatshala lama tiyeke yo herisa ntshunxeko , vululami ni mimpimanyeto ya vanhu .\n", "2020-01-12 14:48:54,474 Example #3\n", "2020-01-12 14:48:54,474 \tSource: Baptism symbolizes dedication to God and marks one’s ordination as a minister\n", "2020-01-12 14:48:54,475 \tReference: Ku khuvuriwa ku kombisa ku tinyiketela eka Xikwembu naswona ku fungha ku vekiwa ka munhu tanihi mutirheli\n", "2020-01-12 14:48:54,475 \tHypothesis: Nkhuvulo wu fanekisela ku tinyiketela eka Xikwembu ni ku teka ku fuma ka wena tanihi mutirheli\n", "2020-01-12 14:48:54,475 Validation result (greedy) at epoch 4, step 33000: bleu: 31.41, loss: 39230.7031, ppl: 3.7752, duration: 37.3761s\n", "2020-01-12 14:49:08,889 Epoch 4 Step: 33100 Batch Loss: 1.502866 Tokens per Sec: 17581, Lr: 0.000300\n", "2020-01-12 14:49:23,281 Epoch 4 Step: 33200 Batch Loss: 1.638937 Tokens per Sec: 17612, Lr: 0.000300\n", "2020-01-12 14:49:37,573 Epoch 4 Step: 33300 Batch Loss: 1.626535 Tokens per Sec: 17324, Lr: 0.000300\n", "2020-01-12 14:49:52,016 Epoch 4 Step: 33400 Batch Loss: 1.820437 Tokens per Sec: 17598, Lr: 0.000300\n", "2020-01-12 14:50:06,311 Epoch 4 Step: 33500 Batch Loss: 1.453813 Tokens per Sec: 17234, Lr: 0.000300\n", "2020-01-12 14:50:20,652 Epoch 4 Step: 33600 Batch Loss: 1.485858 Tokens per Sec: 17427, Lr: 0.000300\n", "2020-01-12 14:50:34,962 Epoch 4 Step: 33700 Batch Loss: 1.502579 Tokens per Sec: 17516, Lr: 0.000300\n", "2020-01-12 14:50:49,296 Epoch 4 Step: 33800 Batch Loss: 1.174403 Tokens per Sec: 17411, Lr: 0.000300\n", "2020-01-12 14:51:03,664 Epoch 4 Step: 33900 Batch Loss: 1.632755 Tokens per Sec: 17540, Lr: 0.000300\n", "2020-01-12 14:51:18,009 Epoch 4 Step: 34000 Batch Loss: 1.437027 Tokens per Sec: 17632, Lr: 0.000300\n", "2020-01-12 14:51:55,213 Hooray! 
New best validation result [ppl]!\n", "2020-01-12 14:51:55,213 Saving new checkpoint.\n", "2020-01-12 14:51:55,559 Example #0\n", "2020-01-12 14:51:55,559 \tSource: But , naturally , we also want relief .\n", "2020-01-12 14:51:55,559 \tReference: Kambe hi lava leswaku ku xaniseka ku herisiwa .\n", "2020-01-12 14:51:55,559 \tHypothesis: Kambe , hi ntolovelo , na hina hi lava ku ntshunxiwa .\n", "2020-01-12 14:51:55,559 Example #1\n", "2020-01-12 14:51:55,560 \tSource: 3 “ We’re Letting You Go ”\n", "2020-01-12 14:51:55,560 \tReference: 3 “ Ntirho Wa Wena Wu Herile ”\n", "2020-01-12 14:51:55,560 \tHypothesis: 3 “ Hi Languta N’wina ”\n", "2020-01-12 14:51:55,560 Example #2\n", "2020-01-12 14:51:55,560 \tSource: Paralleling our actions against terrorism , we must have equally vigorous efforts to enhance freedom , dignity , justice , and humanitarian values .\n", "2020-01-12 14:51:55,560 \tReference: Loko hi ri karhi hi lwisana ni vutherorisi , hi fanele hi endla matshalatshala lamakulu yo lwela ntshunxeko , xindzhuti , vululami ni ku antswisa timfanelo ta hina .\n", "2020-01-12 14:51:55,560 \tHypothesis: Ku tshika swiendlo swa hina swo lwisana ni vutherorisi , hi fanele hi va ni matshalatshala lama nga ni matimba yo herisa ntshunxeko , vululami , vululami ni mahanyelo ya vanhu .\n", "2020-01-12 14:51:55,560 Example #3\n", "2020-01-12 14:51:55,560 \tSource: Baptism symbolizes dedication to God and marks one’s ordination as a minister\n", "2020-01-12 14:51:55,560 \tReference: Ku khuvuriwa ku kombisa ku tinyiketela eka Xikwembu naswona ku fungha ku vekiwa ka munhu tanihi mutirheli\n", "2020-01-12 14:51:55,561 \tHypothesis: Nkhuvulo wu fanekisela ku tinyiketela eka Xikwembu ni ku fungha ku fuma ka munhu tanihi mutirheli\n", "2020-01-12 14:51:55,561 Validation result (greedy) at epoch 4, step 34000: bleu: 31.85, loss: 38992.8711, ppl: 3.7449, duration: 37.5512s\n", "2020-01-12 14:52:09,941 Epoch 4 Step: 34100 Batch Loss: 1.576459 Tokens per Sec: 17612, Lr: 0.000300\n", "2020-01-12 14:52:24,324 Epoch 4 Step: 34200 Batch Loss: 1.715775 Tokens per Sec: 17619, Lr: 0.000300\n", "2020-01-12 14:52:38,625 Epoch 4 Step: 34300 Batch Loss: 1.278049 Tokens per Sec: 17542, Lr: 0.000300\n", "2020-01-12 14:52:52,895 Epoch 4 Step: 34400 Batch Loss: 1.407421 Tokens per Sec: 17388, Lr: 0.000300\n", "2020-01-12 14:53:07,262 Epoch 4 Step: 34500 Batch Loss: 1.375490 Tokens per Sec: 17654, Lr: 0.000300\n", "2020-01-12 14:53:21,637 Epoch 4 Step: 34600 Batch Loss: 1.876229 Tokens per Sec: 17496, Lr: 0.000300\n", "2020-01-12 14:53:35,939 Epoch 4 Step: 34700 Batch Loss: 1.707477 Tokens per Sec: 16972, Lr: 0.000300\n", "2020-01-12 14:53:50,359 Epoch 4 Step: 34800 Batch Loss: 1.598695 Tokens per Sec: 17543, Lr: 0.000300\n", "2020-01-12 14:54:04,618 Epoch 4 Step: 34900 Batch Loss: 1.504966 Tokens per Sec: 16970, Lr: 0.000300\n", "2020-01-12 14:54:19,060 Epoch 4 Step: 35000 Batch Loss: 1.545066 Tokens per Sec: 17383, Lr: 0.000300\n", "2020-01-12 14:54:56,347 Hooray! 
New best validation result [ppl]!\n", "2020-01-12 14:54:56,348 Saving new checkpoint.\n", "2020-01-12 14:54:56,665 Example #0\n", "2020-01-12 14:54:56,665 \tSource: But , naturally , we also want relief .\n", "2020-01-12 14:54:56,665 \tReference: Kambe hi lava leswaku ku xaniseka ku herisiwa .\n", "2020-01-12 14:54:56,666 \tHypothesis: Kambe , hi ntolovelo hi lava ku ntshunxiwa .\n", "2020-01-12 14:54:56,666 Example #1\n", "2020-01-12 14:54:56,666 \tSource: 3 “ We’re Letting You Go ”\n", "2020-01-12 14:54:56,666 \tReference: 3 “ Ntirho Wa Wena Wu Herile ”\n", "2020-01-12 14:54:56,666 \tHypothesis: 3 “ Hi Yingisa N’wina ”\n", "2020-01-12 14:54:56,666 Example #2\n", "2020-01-12 14:54:56,666 \tSource: Paralleling our actions against terrorism , we must have equally vigorous efforts to enhance freedom , dignity , justice , and humanitarian values .\n", "2020-01-12 14:54:56,666 \tReference: Loko hi ri karhi hi lwisana ni vutherorisi , hi fanele hi endla matshalatshala lamakulu yo lwela ntshunxeko , xindzhuti , vululami ni ku antswisa timfanelo ta hina .\n", "2020-01-12 14:54:56,666 \tHypothesis: Ku tshika swiendlo swa hina swo lwisana ni vutherorisi , hi fanele hi va ni matshalatshala lama engetelekeke yo herisa ntshunxeko , vululami , vululami ni mimpimanyeto ya vanhu .\n", "2020-01-12 14:54:56,666 Example #3\n", "2020-01-12 14:54:56,667 \tSource: Baptism symbolizes dedication to God and marks one’s ordination as a minister\n", "2020-01-12 14:54:56,667 \tReference: Ku khuvuriwa ku kombisa ku tinyiketela eka Xikwembu naswona ku fungha ku vekiwa ka munhu tanihi mutirheli\n", "2020-01-12 14:54:56,667 \tHypothesis: Baptism u kombisa ku tinyiketela eka Xikwembu ni ku fungha ku fuma ka munhu tanihi mutirheli\n", "2020-01-12 14:54:56,667 Validation result (greedy) at epoch 4, step 35000: bleu: 32.42, loss: 38774.5156, ppl: 3.7173, duration: 37.6067s\n", "2020-01-12 14:55:11,017 Epoch 4 Step: 35100 Batch Loss: 1.460916 Tokens per Sec: 17305, Lr: 0.000300\n", "2020-01-12 14:55:25,432 Epoch 4 Step: 35200 Batch Loss: 1.457319 Tokens per Sec: 17471, Lr: 0.000300\n", "2020-01-12 14:55:39,796 Epoch 4 Step: 35300 Batch Loss: 1.451526 Tokens per Sec: 17590, Lr: 0.000300\n", "2020-01-12 14:55:54,148 Epoch 4 Step: 35400 Batch Loss: 1.358786 Tokens per Sec: 17201, Lr: 0.000300\n", "2020-01-12 14:56:08,539 Epoch 4 Step: 35500 Batch Loss: 1.707052 Tokens per Sec: 17500, Lr: 0.000300\n", "2020-01-12 14:56:22,923 Epoch 4 Step: 35600 Batch Loss: 1.717293 Tokens per Sec: 17181, Lr: 0.000300\n", "2020-01-12 14:56:37,329 Epoch 4 Step: 35700 Batch Loss: 1.651282 Tokens per Sec: 17557, Lr: 0.000300\n", "2020-01-12 14:56:51,764 Epoch 4 Step: 35800 Batch Loss: 1.294732 Tokens per Sec: 17676, Lr: 0.000300\n", "2020-01-12 14:57:02,210 Epoch 4: total training loss 13600.18\n", "2020-01-12 14:57:02,210 EPOCH 5\n", "2020-01-12 14:57:07,417 Epoch 5 Step: 35900 Batch Loss: 1.536982 Tokens per Sec: 13600, Lr: 0.000300\n", "2020-01-12 14:57:21,631 Epoch 5 Step: 36000 Batch Loss: 1.576982 Tokens per Sec: 16881, Lr: 0.000300\n", "2020-01-12 14:57:59,098 Example #0\n", "2020-01-12 14:57:59,098 \tSource: But , naturally , we also want relief .\n", "2020-01-12 14:57:59,099 \tReference: Kambe hi lava leswaku ku xaniseka ku herisiwa .\n", "2020-01-12 14:57:59,099 \tHypothesis: Kambe , hi ntolovelo hi lava ku ntshunxiwa .\n", "2020-01-12 14:57:59,099 Example #1\n", "2020-01-12 14:57:59,099 \tSource: 3 “ We’re Letting You Go ”\n", "2020-01-12 14:57:59,099 \tReference: 3 “ Ntirho Wa Wena Wu Herile ”\n", "2020-01-12 14:57:59,099 \tHypothesis: 
3 “ Hi Fanele Hi Ku Tirhisa ”\n", "2020-01-12 14:57:59,099 Example #2\n", "2020-01-12 14:57:59,100 \tSource: Paralleling our actions against terrorism , we must have equally vigorous efforts to enhance freedom , dignity , justice , and humanitarian values .\n", "2020-01-12 14:57:59,100 \tReference: Loko hi ri karhi hi lwisana ni vutherorisi , hi fanele hi endla matshalatshala lamakulu yo lwela ntshunxeko , xindzhuti , vululami ni ku antswisa timfanelo ta hina .\n", "2020-01-12 14:57:59,100 \tHypothesis: Ku tshika swiendlo swa hina swo lwisana ni vutherorisi , hi fanele hi va ni matshalatshala lama engetelekeke yo nyikela ntshunxeko , vululami , vululami ni mahanyelo ya vanhu .\n", "2020-01-12 14:57:59,100 Example #3\n", "2020-01-12 14:57:59,100 \tSource: Baptism symbolizes dedication to God and marks one’s ordination as a minister\n", "2020-01-12 14:57:59,100 \tReference: Ku khuvuriwa ku kombisa ku tinyiketela eka Xikwembu naswona ku fungha ku vekiwa ka munhu tanihi mutirheli\n", "2020-01-12 14:57:59,100 \tHypothesis: Nkhuvulo wu kombisa ku tinyiketela eka Xikwembu ni ku fungha ndlela leyi munhu a nga mutirheli ha yona\n", "2020-01-12 14:57:59,100 Validation result (greedy) at epoch 5, step 36000: bleu: 32.23, loss: 38793.9375, ppl: 3.7198, duration: 37.4694s\n", "2020-01-12 14:58:13,528 Epoch 5 Step: 36100 Batch Loss: 1.337618 Tokens per Sec: 17477, Lr: 0.000300\n", "2020-01-12 14:58:28,002 Epoch 5 Step: 36200 Batch Loss: 1.503850 Tokens per Sec: 17341, Lr: 0.000300\n", "2020-01-12 14:58:42,343 Epoch 5 Step: 36300 Batch Loss: 1.340993 Tokens per Sec: 17371, Lr: 0.000300\n", "2020-01-12 14:58:56,856 Epoch 5 Step: 36400 Batch Loss: 1.303088 Tokens per Sec: 17656, Lr: 0.000300\n", "2020-01-12 14:59:11,266 Epoch 5 Step: 36500 Batch Loss: 1.514919 Tokens per Sec: 17461, Lr: 0.000300\n", "2020-01-12 14:59:25,649 Epoch 5 Step: 36600 Batch Loss: 1.338880 Tokens per Sec: 17133, Lr: 0.000300\n", "2020-01-12 14:59:40,152 Epoch 5 Step: 36700 Batch Loss: 1.599659 Tokens per Sec: 17809, Lr: 0.000300\n", "2020-01-12 14:59:54,487 Epoch 5 Step: 36800 Batch Loss: 1.365993 Tokens per Sec: 17174, Lr: 0.000300\n", "2020-01-12 15:00:08,901 Epoch 5 Step: 36900 Batch Loss: 1.492077 Tokens per Sec: 17584, Lr: 0.000300\n", "2020-01-12 15:00:23,384 Epoch 5 Step: 37000 Batch Loss: 1.511816 Tokens per Sec: 17514, Lr: 0.000300\n", "2020-01-12 15:01:00,737 Hooray! 
New best validation result [ppl]!\n", "2020-01-12 15:01:00,737 Saving new checkpoint.\n", "2020-01-12 15:01:01,068 Example #0\n", "2020-01-12 15:01:01,068 \tSource: But , naturally , we also want relief .\n", "2020-01-12 15:01:01,068 \tReference: Kambe hi lava leswaku ku xaniseka ku herisiwa .\n", "2020-01-12 15:01:01,068 \tHypothesis: Kambe , hi ntolovelo hi lava ku ntshunxiwa .\n", "2020-01-12 15:01:01,068 Example #1\n", "2020-01-12 15:01:01,068 \tSource: 3 “ We’re Letting You Go ”\n", "2020-01-12 15:01:01,068 \tReference: 3 “ Ntirho Wa Wena Wu Herile ”\n", "2020-01-12 15:01:01,069 \tHypothesis: 3 “ Hi Tirhisa U N’wi Vangalafula ”\n", "2020-01-12 15:01:01,069 Example #2\n", "2020-01-12 15:01:01,069 \tSource: Paralleling our actions against terrorism , we must have equally vigorous efforts to enhance freedom , dignity , justice , and humanitarian values .\n", "2020-01-12 15:01:01,069 \tReference: Loko hi ri karhi hi lwisana ni vutherorisi , hi fanele hi endla matshalatshala lamakulu yo lwela ntshunxeko , xindzhuti , vululami ni ku antswisa timfanelo ta hina .\n", "2020-01-12 15:01:01,069 \tHypothesis: Loko hi tshika swiendlo swa hina swo lwisana ni vutherorisi , hi fanele hi va ni matshalatshala lama engetelekeke yo kuma ntshunxeko , vululami , vululami ni mahanyelo ya vanhu .\n", "2020-01-12 15:01:01,069 Example #3\n", "2020-01-12 15:01:01,069 \tSource: Baptism symbolizes dedication to God and marks one’s ordination as a minister\n", "2020-01-12 15:01:01,069 \tReference: Ku khuvuriwa ku kombisa ku tinyiketela eka Xikwembu naswona ku fungha ku vekiwa ka munhu tanihi mutirheli\n", "2020-01-12 15:01:01,069 \tHypothesis: Nkhuvulo wu fanekisela ku tinyiketela eka Xikwembu ni ku fungha ka munhu tanihi mutirheli\n", "2020-01-12 15:01:01,069 Validation result (greedy) at epoch 5, step 37000: bleu: 32.07, loss: 38262.8398, ppl: 3.6535, duration: 37.6846s\n", "2020-01-12 15:01:15,421 Epoch 5 Step: 37100 Batch Loss: 1.358166 Tokens per Sec: 17374, Lr: 0.000300\n", "2020-01-12 15:01:29,727 Epoch 5 Step: 37200 Batch Loss: 1.641356 Tokens per Sec: 17011, Lr: 0.000300\n", "2020-01-12 15:01:44,185 Epoch 5 Step: 37300 Batch Loss: 1.554594 Tokens per Sec: 17094, Lr: 0.000300\n", "2020-01-12 15:01:58,505 Epoch 5 Step: 37400 Batch Loss: 1.490752 Tokens per Sec: 17179, Lr: 0.000300\n", "2020-01-12 15:02:13,029 Epoch 5 Step: 37500 Batch Loss: 1.358755 Tokens per Sec: 17721, Lr: 0.000300\n", "2020-01-12 15:02:27,495 Epoch 5 Step: 37600 Batch Loss: 1.450234 Tokens per Sec: 17520, Lr: 0.000300\n", "2020-01-12 15:02:41,961 Epoch 5 Step: 37700 Batch Loss: 1.638605 Tokens per Sec: 17435, Lr: 0.000300\n", "2020-01-12 15:02:56,360 Epoch 5 Step: 37800 Batch Loss: 1.590841 Tokens per Sec: 17417, Lr: 0.000300\n", "2020-01-12 15:03:10,782 Epoch 5 Step: 37900 Batch Loss: 1.442210 Tokens per Sec: 17765, Lr: 0.000300\n", "2020-01-12 15:03:25,221 Epoch 5 Step: 38000 Batch Loss: 1.796350 Tokens per Sec: 17351, Lr: 0.000300\n", "2020-01-12 15:04:02,555 Hooray! 
New best validation result [ppl]!\n", "2020-01-12 15:04:02,555 Saving new checkpoint.\n", "2020-01-12 15:04:02,900 Example #0\n", "2020-01-12 15:04:02,900 \tSource: But , naturally , we also want relief .\n", "2020-01-12 15:04:02,900 \tReference: Kambe hi lava leswaku ku xaniseka ku herisiwa .\n", "2020-01-12 15:04:02,900 \tHypothesis: Kambe , hi ntolovelo , na hina hi lava ku ntshunxiwa .\n", "2020-01-12 15:04:02,900 Example #1\n", "2020-01-12 15:04:02,901 \tSource: 3 “ We’re Letting You Go ”\n", "2020-01-12 15:04:02,901 \tReference: 3 “ Ntirho Wa Wena Wu Herile ”\n", "2020-01-12 15:04:02,901 \tHypothesis: 3 “ Hi Fanele Hi Ku Tirhisa ”\n", "2020-01-12 15:04:02,901 Example #2\n", "2020-01-12 15:04:02,901 \tSource: Paralleling our actions against terrorism , we must have equally vigorous efforts to enhance freedom , dignity , justice , and humanitarian values .\n", "2020-01-12 15:04:02,901 \tReference: Loko hi ri karhi hi lwisana ni vutherorisi , hi fanele hi endla matshalatshala lamakulu yo lwela ntshunxeko , xindzhuti , vululami ni ku antswisa timfanelo ta hina .\n", "2020-01-12 15:04:02,901 \tHypothesis: Loko hi tshika swiendlo swa hina swo lwisana ni vutherorisi , hi fanele hi va ni matshalatshala lama engetelekeke yo kurisa ntshunxeko , vululami , vululami ni mimpimanyeto ya vanhu .\n", "2020-01-12 15:04:02,901 Example #3\n", "2020-01-12 15:04:02,901 \tSource: Baptism symbolizes dedication to God and marks one’s ordination as a minister\n", "2020-01-12 15:04:02,901 \tReference: Ku khuvuriwa ku kombisa ku tinyiketela eka Xikwembu naswona ku fungha ku vekiwa ka munhu tanihi mutirheli\n", "2020-01-12 15:04:02,902 \tHypothesis: Nkhuvulo wu fanekisela ku tinyiketela eka Xikwembu ni ku fungha ka munhu tanihi mutirheli\n", "2020-01-12 15:04:02,902 Validation result (greedy) at epoch 5, step 38000: bleu: 32.46, loss: 37997.4258, ppl: 3.6208, duration: 37.6808s\n", "2020-01-12 15:04:17,387 Epoch 5 Step: 38100 Batch Loss: 1.433262 Tokens per Sec: 17312, Lr: 0.000300\n", "2020-01-12 15:04:31,795 Epoch 5 Step: 38200 Batch Loss: 1.303422 Tokens per Sec: 17268, Lr: 0.000300\n", "2020-01-12 15:04:46,344 Epoch 5 Step: 38300 Batch Loss: 1.302938 Tokens per Sec: 17376, Lr: 0.000300\n", "2020-01-12 15:05:00,850 Epoch 5 Step: 38400 Batch Loss: 1.390227 Tokens per Sec: 17660, Lr: 0.000300\n", "2020-01-12 15:05:15,258 Epoch 5 Step: 38500 Batch Loss: 1.559957 Tokens per Sec: 17328, Lr: 0.000300\n", "2020-01-12 15:05:29,723 Epoch 5 Step: 38600 Batch Loss: 1.407579 Tokens per Sec: 17700, Lr: 0.000300\n", "2020-01-12 15:05:44,201 Epoch 5 Step: 38700 Batch Loss: 1.254171 Tokens per Sec: 17536, Lr: 0.000300\n", "2020-01-12 15:05:58,581 Epoch 5 Step: 38800 Batch Loss: 1.626912 Tokens per Sec: 17350, Lr: 0.000300\n", "2020-01-12 15:06:13,065 Epoch 5 Step: 38900 Batch Loss: 1.484539 Tokens per Sec: 17511, Lr: 0.000300\n", "2020-01-12 15:06:27,475 Epoch 5 Step: 39000 Batch Loss: 1.352845 Tokens per Sec: 17416, Lr: 0.000300\n", "2020-01-12 15:07:04,737 Hooray! 
New best validation result [ppl]!\n", "2020-01-12 15:07:04,737 Saving new checkpoint.\n", "2020-01-12 15:07:05,023 Example #0\n", "2020-01-12 15:07:05,024 \tSource: But , naturally , we also want relief .\n", "2020-01-12 15:07:05,024 \tReference: Kambe hi lava leswaku ku xaniseka ku herisiwa .\n", "2020-01-12 15:07:05,024 \tHypothesis: Kambe , hi ntolovelo , hi lava ku ntshunxiwa .\n", "2020-01-12 15:07:05,024 Example #1\n", "2020-01-12 15:07:05,025 \tSource: 3 “ We’re Letting You Go ”\n", "2020-01-12 15:07:05,025 \tReference: 3 “ Ntirho Wa Wena Wu Herile ”\n", "2020-01-12 15:07:05,025 \tHypothesis: 3 “ Hi Tirhisa U N’wi Vonakanyile ”\n", "2020-01-12 15:07:05,025 Example #2\n", "2020-01-12 15:07:05,026 \tSource: Paralleling our actions against terrorism , we must have equally vigorous efforts to enhance freedom , dignity , justice , and humanitarian values .\n", "2020-01-12 15:07:05,026 \tReference: Loko hi ri karhi hi lwisana ni vutherorisi , hi fanele hi endla matshalatshala lamakulu yo lwela ntshunxeko , xindzhuti , vululami ni ku antswisa timfanelo ta hina .\n", "2020-01-12 15:07:05,026 \tHypothesis: Loko hi tshika swiendlo swa hina swo lwisana ni ku tidlaya , hi fanele hi va ni matshalatshala lama engetelekeke yo ntshunxa ntshunxeko , ku va ni vululami , vululami ni mimpimanyeto ya vanhu .\n", "2020-01-12 15:07:05,026 Example #3\n", "2020-01-12 15:07:05,026 \tSource: Baptism symbolizes dedication to God and marks one’s ordination as a minister\n", "2020-01-12 15:07:05,026 \tReference: Ku khuvuriwa ku kombisa ku tinyiketela eka Xikwembu naswona ku fungha ku vekiwa ka munhu tanihi mutirheli\n", "2020-01-12 15:07:05,027 \tHypothesis: Nkhuvulo wu fanekisela ku tinyiketela eka Xikwembu ni ku fungha ndhuma ya munhu tanihi mutirheli\n", "2020-01-12 15:07:05,027 Validation result (greedy) at epoch 5, step 39000: bleu: 32.74, loss: 37956.5352, ppl: 3.6158, duration: 37.5507s\n", "2020-01-12 15:07:19,531 Epoch 5 Step: 39100 Batch Loss: 1.462075 Tokens per Sec: 17432, Lr: 0.000300\n", "2020-01-12 15:07:33,927 Epoch 5 Step: 39200 Batch Loss: 1.337172 Tokens per Sec: 17288, Lr: 0.000300\n", "2020-01-12 15:07:48,308 Epoch 5 Step: 39300 Batch Loss: 1.486801 Tokens per Sec: 17278, Lr: 0.000300\n", "2020-01-12 15:08:02,685 Epoch 5 Step: 39400 Batch Loss: 1.370221 Tokens per Sec: 17649, Lr: 0.000300\n", "2020-01-12 15:08:17,051 Epoch 5 Step: 39500 Batch Loss: 1.392747 Tokens per Sec: 17183, Lr: 0.000300\n", "2020-01-12 15:08:31,429 Epoch 5 Step: 39600 Batch Loss: 1.464254 Tokens per Sec: 17594, Lr: 0.000300\n", "2020-01-12 15:08:45,780 Epoch 5 Step: 39700 Batch Loss: 1.463673 Tokens per Sec: 17268, Lr: 0.000300\n", "2020-01-12 15:09:00,185 Epoch 5 Step: 39800 Batch Loss: 1.445154 Tokens per Sec: 17206, Lr: 0.000300\n", "2020-01-12 15:09:14,522 Epoch 5 Step: 39900 Batch Loss: 1.643402 Tokens per Sec: 17490, Lr: 0.000300\n", "2020-01-12 15:09:28,838 Epoch 5 Step: 40000 Batch Loss: 1.417121 Tokens per Sec: 17127, Lr: 0.000300\n", "2020-01-12 15:10:06,095 Hooray! 
New best validation result [ppl]!\n", "2020-01-12 15:10:06,095 Saving new checkpoint.\n", "2020-01-12 15:10:06,406 Example #0\n", "2020-01-12 15:10:06,407 \tSource: But , naturally , we also want relief .\n", "2020-01-12 15:10:06,407 \tReference: Kambe hi lava leswaku ku xaniseka ku herisiwa .\n", "2020-01-12 15:10:06,407 \tHypothesis: Kambe , hi ntumbuluko hi lava ku ntshunxiwa .\n", "2020-01-12 15:10:06,407 Example #1\n", "2020-01-12 15:10:06,407 \tSource: 3 “ We’re Letting You Go ”\n", "2020-01-12 15:10:06,407 \tReference: 3 “ Ntirho Wa Wena Wu Herile ”\n", "2020-01-12 15:10:06,407 \tHypothesis: 3 “ Hi Ku Tirhela ”\n", "2020-01-12 15:10:06,407 Example #2\n", "2020-01-12 15:10:06,407 \tSource: Paralleling our actions against terrorism , we must have equally vigorous efforts to enhance freedom , dignity , justice , and humanitarian values .\n", "2020-01-12 15:10:06,407 \tReference: Loko hi ri karhi hi lwisana ni vutherorisi , hi fanele hi endla matshalatshala lamakulu yo lwela ntshunxeko , xindzhuti , vululami ni ku antswisa timfanelo ta hina .\n", "2020-01-12 15:10:06,408 \tHypothesis: Ku tshika swiendlo swa hina swo lwisana ni vutherorisi , hi fanele hi va ni matshalatshala lama engetelekeke yo kuma ntshunxeko , vululami , vululami ni mahanyelo ya vanhu .\n", "2020-01-12 15:10:06,408 Example #3\n", "2020-01-12 15:10:06,408 \tSource: Baptism symbolizes dedication to God and marks one’s ordination as a minister\n", "2020-01-12 15:10:06,408 \tReference: Ku khuvuriwa ku kombisa ku tinyiketela eka Xikwembu naswona ku fungha ku vekiwa ka munhu tanihi mutirheli\n", "2020-01-12 15:10:06,408 \tHypothesis: Nkhuvulo wu kombisa ku tinyiketela eka Xikwembu ni ku fungha ku va mutirheli\n", "2020-01-12 15:10:06,408 Validation result (greedy) at epoch 5, step 40000: bleu: 32.51, loss: 37645.3047, ppl: 3.5779, duration: 37.5695s\n", "2020-01-12 15:10:20,847 Epoch 5 Step: 40100 Batch Loss: 1.529551 Tokens per Sec: 17806, Lr: 0.000300\n", "2020-01-12 15:10:35,159 Epoch 5 Step: 40200 Batch Loss: 1.940098 Tokens per Sec: 16972, Lr: 0.000300\n", "2020-01-12 15:10:49,558 Epoch 5 Step: 40300 Batch Loss: 1.552458 Tokens per Sec: 17262, Lr: 0.000300\n", "2020-01-12 15:11:03,947 Epoch 5 Step: 40400 Batch Loss: 1.287683 Tokens per Sec: 17390, Lr: 0.000300\n", "2020-01-12 15:11:18,331 Epoch 5 Step: 40500 Batch Loss: 1.416657 Tokens per Sec: 17464, Lr: 0.000300\n", "2020-01-12 15:11:32,694 Epoch 5 Step: 40600 Batch Loss: 2.033160 Tokens per Sec: 17036, Lr: 0.000300\n", "2020-01-12 15:11:47,127 Epoch 5 Step: 40700 Batch Loss: 1.428688 Tokens per Sec: 17272, Lr: 0.000300\n", "2020-01-12 15:12:01,561 Epoch 5 Step: 40800 Batch Loss: 1.494907 Tokens per Sec: 17477, Lr: 0.000300\n", "2020-01-12 15:12:15,870 Epoch 5 Step: 40900 Batch Loss: 1.338884 Tokens per Sec: 17162, Lr: 0.000300\n", "2020-01-12 15:12:30,189 Epoch 5 Step: 41000 Batch Loss: 1.318781 Tokens per Sec: 17231, Lr: 0.000300\n", "2020-01-12 15:13:07,459 Hooray! 
New best validation result [ppl]!\n", "2020-01-12 15:13:07,460 Saving new checkpoint.\n", "2020-01-12 15:13:07,776 Example #0\n", "2020-01-12 15:13:07,776 \tSource: But , naturally , we also want relief .\n", "2020-01-12 15:13:07,776 \tReference: Kambe hi lava leswaku ku xaniseka ku herisiwa .\n", "2020-01-12 15:13:07,776 \tHypothesis: Kambe , hi ntolovelo , na hina hi lava ku ntshunxiwa .\n", "2020-01-12 15:13:07,776 Example #1\n", "2020-01-12 15:13:07,776 \tSource: 3 “ We’re Letting You Go ”\n", "2020-01-12 15:13:07,776 \tReference: 3 “ Ntirho Wa Wena Wu Herile ”\n", "2020-01-12 15:13:07,776 \tHypothesis: 3 “ Hi Tirhisa U N’wi Vonile ”\n", "2020-01-12 15:13:07,777 Example #2\n", "2020-01-12 15:13:07,777 \tSource: Paralleling our actions against terrorism , we must have equally vigorous efforts to enhance freedom , dignity , justice , and humanitarian values .\n", "2020-01-12 15:13:07,777 \tReference: Loko hi ri karhi hi lwisana ni vutherorisi , hi fanele hi endla matshalatshala lamakulu yo lwela ntshunxeko , xindzhuti , vululami ni ku antswisa timfanelo ta hina .\n", "2020-01-12 15:13:07,777 \tHypothesis: Ku tshika swiendlo swa hina swo lwisana ni vutherorisi , hi fanele hi va ni matshalatshala lama engetelekeke yo kuma ntshunxeko , vululami ni mahanyelo ya vanhu .\n", "2020-01-12 15:13:07,777 Example #3\n", "2020-01-12 15:13:07,777 \tSource: Baptism symbolizes dedication to God and marks one’s ordination as a minister\n", "2020-01-12 15:13:07,777 \tReference: Ku khuvuriwa ku kombisa ku tinyiketela eka Xikwembu naswona ku fungha ku vekiwa ka munhu tanihi mutirheli\n", "2020-01-12 15:13:07,777 \tHypothesis: Nkhuvulo wu kombisa ku tinyiketela eka Xikwembu ni ku fungha ku fuma ka munhu tanihi mutirheli\n", "2020-01-12 15:13:07,777 Validation result (greedy) at epoch 5, step 41000: bleu: 33.17, loss: 37480.3945, ppl: 3.5580, duration: 37.5882s\n", "2020-01-12 15:13:22,150 Epoch 5 Step: 41100 Batch Loss: 1.389886 Tokens per Sec: 17576, Lr: 0.000300\n", "2020-01-12 15:13:36,478 Epoch 5 Step: 41200 Batch Loss: 1.529689 Tokens per Sec: 17514, Lr: 0.000300\n", "2020-01-12 15:13:50,827 Epoch 5 Step: 41300 Batch Loss: 1.514939 Tokens per Sec: 17285, Lr: 0.000300\n", "2020-01-12 15:14:05,176 Epoch 5 Step: 41400 Batch Loss: 1.337812 Tokens per Sec: 17588, Lr: 0.000300\n", "2020-01-12 15:14:19,558 Epoch 5 Step: 41500 Batch Loss: 1.629494 Tokens per Sec: 17489, Lr: 0.000300\n", "2020-01-12 15:14:33,905 Epoch 5 Step: 41600 Batch Loss: 1.528360 Tokens per Sec: 17549, Lr: 0.000300\n", "2020-01-12 15:14:48,236 Epoch 5 Step: 41700 Batch Loss: 1.580790 Tokens per Sec: 17337, Lr: 0.000300\n", "2020-01-12 15:15:02,735 Epoch 5 Step: 41800 Batch Loss: 1.378512 Tokens per Sec: 17341, Lr: 0.000300\n", "2020-01-12 15:15:17,151 Epoch 5 Step: 41900 Batch Loss: 1.400483 Tokens per Sec: 17944, Lr: 0.000300\n", "2020-01-12 15:15:31,404 Epoch 5 Step: 42000 Batch Loss: 1.318969 Tokens per Sec: 17078, Lr: 0.000300\n", "2020-01-12 15:16:08,788 Hooray! 
New best validation result [ppl]!\n", "2020-01-12 15:16:08,788 Saving new checkpoint.\n", "2020-01-12 15:16:09,124 Example #0\n", "2020-01-12 15:16:09,125 \tSource: But , naturally , we also want relief .\n", "2020-01-12 15:16:09,125 \tReference: Kambe hi lava leswaku ku xaniseka ku herisiwa .\n", "2020-01-12 15:16:09,125 \tHypothesis: Kambe , hi ntolovelo , hi lava ku ntshunxiwa .\n", "2020-01-12 15:16:09,125 Example #1\n", "2020-01-12 15:16:09,126 \tSource: 3 “ We’re Letting You Go ”\n", "2020-01-12 15:16:09,126 \tReference: 3 “ Ntirho Wa Wena Wu Herile ”\n", "2020-01-12 15:16:09,126 \tHypothesis: 3 “ Hi Ku Tiva U Ri Kona ”\n", "2020-01-12 15:16:09,126 Example #2\n", "2020-01-12 15:16:09,126 \tSource: Paralleling our actions against terrorism , we must have equally vigorous efforts to enhance freedom , dignity , justice , and humanitarian values .\n", "2020-01-12 15:16:09,126 \tReference: Loko hi ri karhi hi lwisana ni vutherorisi , hi fanele hi endla matshalatshala lamakulu yo lwela ntshunxeko , xindzhuti , vululami ni ku antswisa timfanelo ta hina .\n", "2020-01-12 15:16:09,126 \tHypothesis: Ku tshika swiendlo swa hina swo lwisana ni vutherorisi , hi fanele hi va ni matshalatshala lama engetelekeke yo kuma ntshunxeko , vululami , vululami ni mahanyelo ya vanhu .\n", "2020-01-12 15:16:09,126 Example #3\n", "2020-01-12 15:16:09,126 \tSource: Baptism symbolizes dedication to God and marks one’s ordination as a minister\n", "2020-01-12 15:16:09,126 \tReference: Ku khuvuriwa ku kombisa ku tinyiketela eka Xikwembu naswona ku fungha ku vekiwa ka munhu tanihi mutirheli\n", "2020-01-12 15:16:09,127 \tHypothesis: Nkhuvulo wu fanekisela ku tinyiketela eka Xikwembu ni ku fungha ku fuma ka munhu tanihi mutirheli\n", "2020-01-12 15:16:09,127 Validation result (greedy) at epoch 5, step 42000: bleu: 33.40, loss: 37400.1211, ppl: 3.5483, duration: 37.7220s\n", "2020-01-12 15:16:23,502 Epoch 5 Step: 42100 Batch Loss: 1.148479 Tokens per Sec: 17204, Lr: 0.000300\n", "2020-01-12 15:16:37,882 Epoch 5 Step: 42200 Batch Loss: 1.391523 Tokens per Sec: 17543, Lr: 0.000300\n", "2020-01-12 15:16:52,241 Epoch 5 Step: 42300 Batch Loss: 1.506419 Tokens per Sec: 17373, Lr: 0.000300\n", "2020-01-12 15:17:06,657 Epoch 5 Step: 42400 Batch Loss: 1.226855 Tokens per Sec: 17511, Lr: 0.000300\n", "2020-01-12 15:17:21,019 Epoch 5 Step: 42500 Batch Loss: 1.646515 Tokens per Sec: 17454, Lr: 0.000300\n", "2020-01-12 15:17:35,423 Epoch 5 Step: 42600 Batch Loss: 1.445156 Tokens per Sec: 17541, Lr: 0.000300\n", "2020-01-12 15:17:49,848 Epoch 5 Step: 42700 Batch Loss: 1.491799 Tokens per Sec: 17723, Lr: 0.000300\n", "2020-01-12 15:18:04,149 Epoch 5 Step: 42800 Batch Loss: 1.417617 Tokens per Sec: 17218, Lr: 0.000300\n", "2020-01-12 15:18:18,502 Epoch 5 Step: 42900 Batch Loss: 1.255283 Tokens per Sec: 17245, Lr: 0.000300\n", "2020-01-12 15:18:32,944 Epoch 5 Step: 43000 Batch Loss: 1.149139 Tokens per Sec: 17196, Lr: 0.000300\n", "2020-01-12 15:19:10,267 Hooray! 
New best validation result [ppl]!\n", "2020-01-12 15:19:10,267 Saving new checkpoint.\n", "2020-01-12 15:19:10,576 Example #0\n", "2020-01-12 15:19:10,576 \tSource: But , naturally , we also want relief .\n", "2020-01-12 15:19:10,577 \tReference: Kambe hi lava leswaku ku xaniseka ku herisiwa .\n", "2020-01-12 15:19:10,577 \tHypothesis: Kambe , hi ntolovelo , na hina hi lava ku ntshunxeka .\n", "2020-01-12 15:19:10,577 Example #1\n", "2020-01-12 15:19:10,577 \tSource: 3 “ We’re Letting You Go ”\n", "2020-01-12 15:19:10,577 \tReference: 3 “ Ntirho Wa Wena Wu Herile ”\n", "2020-01-12 15:19:10,577 \tHypothesis: 3 “ Hi Ku Lwisana Na Wena ”\n", "2020-01-12 15:19:10,577 Example #2\n", "2020-01-12 15:19:10,577 \tSource: Paralleling our actions against terrorism , we must have equally vigorous efforts to enhance freedom , dignity , justice , and humanitarian values .\n", "2020-01-12 15:19:10,577 \tReference: Loko hi ri karhi hi lwisana ni vutherorisi , hi fanele hi endla matshalatshala lamakulu yo lwela ntshunxeko , xindzhuti , vululami ni ku antswisa timfanelo ta hina .\n", "2020-01-12 15:19:10,577 \tHypothesis: Ku tshika swiendlo swa hina swo lwisana ni vutherorisi , hi fanele hi va ni matshalatshala lama fanaka yo kuma ntshunxeko , vululami ni mimpimanyeto ya vanhu .\n", "2020-01-12 15:19:10,577 Example #3\n", "2020-01-12 15:19:10,578 \tSource: Baptism symbolizes dedication to God and marks one’s ordination as a minister\n", "2020-01-12 15:19:10,578 \tReference: Ku khuvuriwa ku kombisa ku tinyiketela eka Xikwembu naswona ku fungha ku vekiwa ka munhu tanihi mutirheli\n", "2020-01-12 15:19:10,578 \tHypothesis: Nkhuvulo wu fanekisela ku tinyiketela eka Xikwembu ni ku fungha ku va mutirheli\n", "2020-01-12 15:19:10,578 Validation result (greedy) at epoch 5, step 43000: bleu: 33.06, loss: 37119.0000, ppl: 3.5147, duration: 37.6336s\n", "2020-01-12 15:19:25,076 Epoch 5 Step: 43100 Batch Loss: 1.431666 Tokens per Sec: 17273, Lr: 0.000300\n", "2020-01-12 15:19:39,516 Epoch 5 Step: 43200 Batch Loss: 1.590371 Tokens per Sec: 17597, Lr: 0.000300\n", "2020-01-12 15:19:53,946 Epoch 5 Step: 43300 Batch Loss: 1.309256 Tokens per Sec: 17636, Lr: 0.000300\n", "2020-01-12 15:20:08,409 Epoch 5 Step: 43400 Batch Loss: 1.328098 Tokens per Sec: 17415, Lr: 0.000300\n", "2020-01-12 15:20:22,756 Epoch 5 Step: 43500 Batch Loss: 1.595765 Tokens per Sec: 17290, Lr: 0.000300\n", "2020-01-12 15:20:37,058 Epoch 5 Step: 43600 Batch Loss: 1.492647 Tokens per Sec: 17477, Lr: 0.000300\n", "2020-01-12 15:20:51,450 Epoch 5 Step: 43700 Batch Loss: 1.485634 Tokens per Sec: 17285, Lr: 0.000300\n", "2020-01-12 15:21:05,825 Epoch 5 Step: 43800 Batch Loss: 1.449124 Tokens per Sec: 17358, Lr: 0.000300\n", "2020-01-12 15:21:20,188 Epoch 5 Step: 43900 Batch Loss: 1.511950 Tokens per Sec: 17302, Lr: 0.000300\n", "2020-01-12 15:21:34,591 Epoch 5 Step: 44000 Batch Loss: 1.335942 Tokens per Sec: 17622, Lr: 0.000300\n", "2020-01-12 15:22:12,059 Hooray! 
New best validation result [ppl]!\n", "2020-01-12 15:22:12,059 Saving new checkpoint.\n", "2020-01-12 15:22:12,359 Example #0\n", "2020-01-12 15:22:12,359 \tSource: But , naturally , we also want relief .\n", "2020-01-12 15:22:12,359 \tReference: Kambe hi lava leswaku ku xaniseka ku herisiwa .\n", "2020-01-12 15:22:12,360 \tHypothesis: Kambe , hi ntolovelo , na hina hi lava ku ntshunxiwa .\n", "2020-01-12 15:22:12,360 Example #1\n", "2020-01-12 15:22:12,360 \tSource: 3 “ We’re Letting You Go ”\n", "2020-01-12 15:22:12,360 \tReference: 3 “ Ntirho Wa Wena Wu Herile ”\n", "2020-01-12 15:22:12,360 \tHypothesis: 3 “ Hi Ku Langutela ”\n", "2020-01-12 15:22:12,360 Example #2\n", "2020-01-12 15:22:12,360 \tSource: Paralleling our actions against terrorism , we must have equally vigorous efforts to enhance freedom , dignity , justice , and humanitarian values .\n", "2020-01-12 15:22:12,360 \tReference: Loko hi ri karhi hi lwisana ni vutherorisi , hi fanele hi endla matshalatshala lamakulu yo lwela ntshunxeko , xindzhuti , vululami ni ku antswisa timfanelo ta hina .\n", "2020-01-12 15:22:12,360 \tHypothesis: Loko hi tshika swiendlo swa hina swo lwisana ni vutherorisi , hi fanele hi tikarhatela ku kuma ntshunxeko , ku xiximeka , vululami ni mimpimanyeto ya vanhu .\n", "2020-01-12 15:22:12,360 Example #3\n", "2020-01-12 15:22:12,361 \tSource: Baptism symbolizes dedication to God and marks one’s ordination as a minister\n", "2020-01-12 15:22:12,361 \tReference: Ku khuvuriwa ku kombisa ku tinyiketela eka Xikwembu naswona ku fungha ku vekiwa ka munhu tanihi mutirheli\n", "2020-01-12 15:22:12,361 \tHypothesis: Nkhuvulo wu fanekisela ku tinyiketela eka Xikwembu ni ku fungha ka munhu tanihi mutirheli\n", "2020-01-12 15:22:12,361 Validation result (greedy) at epoch 5, step 44000: bleu: 33.26, loss: 37003.7617, ppl: 3.5010, duration: 37.7690s\n", "2020-01-12 15:22:26,835 Epoch 5 Step: 44100 Batch Loss: 1.452604 Tokens per Sec: 17600, Lr: 0.000300\n", "2020-01-12 15:22:41,179 Epoch 5 Step: 44200 Batch Loss: 1.496922 Tokens per Sec: 17118, Lr: 0.000300\n", "2020-01-12 15:22:55,673 Epoch 5 Step: 44300 Batch Loss: 1.306487 Tokens per Sec: 17581, Lr: 0.000300\n", "2020-01-12 15:23:10,018 Epoch 5 Step: 44400 Batch Loss: 1.604735 Tokens per Sec: 17526, Lr: 0.000300\n", "2020-01-12 15:23:24,444 Epoch 5 Step: 44500 Batch Loss: 1.338882 Tokens per Sec: 17204, Lr: 0.000300\n", "2020-01-12 15:23:38,837 Epoch 5 Step: 44600 Batch Loss: 1.444311 Tokens per Sec: 17278, Lr: 0.000300\n", "2020-01-12 15:23:53,310 Epoch 5 Step: 44700 Batch Loss: 1.162775 Tokens per Sec: 17347, Lr: 0.000300\n", "2020-01-12 15:24:07,657 Epoch 5 Step: 44800 Batch Loss: 1.345309 Tokens per Sec: 17199, Lr: 0.000300\n", "2020-01-12 15:24:13,371 Epoch 5: total training loss 12863.80\n", "2020-01-12 15:24:13,371 EPOCH 6\n", "2020-01-12 15:24:23,371 Epoch 6 Step: 44900 Batch Loss: 1.547712 Tokens per Sec: 15293, Lr: 0.000300\n", "2020-01-12 15:24:37,603 Epoch 6 Step: 45000 Batch Loss: 1.310931 Tokens per Sec: 17458, Lr: 0.000300\n", "2020-01-12 15:25:15,025 Hooray! 
New best validation result [ppl]!\n", "2020-01-12 15:25:15,025 Saving new checkpoint.\n", "2020-01-12 15:25:15,331 Example #0\n", "2020-01-12 15:25:15,332 \tSource: But , naturally , we also want relief .\n", "2020-01-12 15:25:15,332 \tReference: Kambe hi lava leswaku ku xaniseka ku herisiwa .\n", "2020-01-12 15:25:15,332 \tHypothesis: Kambe , hi ntolovelo , hi lava ku ntshunxeka .\n", "2020-01-12 15:25:15,332 Example #1\n", "2020-01-12 15:25:15,333 \tSource: 3 “ We’re Letting You Go ”\n", "2020-01-12 15:25:15,333 \tReference: 3 “ Ntirho Wa Wena Wu Herile ”\n", "2020-01-12 15:25:15,333 \tHypothesis: 3 “ Hi Ku Tirhisa ”\n", "2020-01-12 15:25:15,333 Example #2\n", "2020-01-12 15:25:15,333 \tSource: Paralleling our actions against terrorism , we must have equally vigorous efforts to enhance freedom , dignity , justice , and humanitarian values .\n", "2020-01-12 15:25:15,333 \tReference: Loko hi ri karhi hi lwisana ni vutherorisi , hi fanele hi endla matshalatshala lamakulu yo lwela ntshunxeko , xindzhuti , vululami ni ku antswisa timfanelo ta hina .\n", "2020-01-12 15:25:15,333 \tHypothesis: Loko hi tshika swiendlo swa hina swo lwisana ni vutherorisi , hi fanele hi va ni matshalatshala lama engetelekeke yo kurisa ntshunxeko , ku va ni vululami , vululami ni mimpimanyeto ya vanhu .\n", "2020-01-12 15:25:15,334 Example #3\n", "2020-01-12 15:25:15,334 \tSource: Baptism symbolizes dedication to God and marks one’s ordination as a minister\n", "2020-01-12 15:25:15,334 \tReference: Ku khuvuriwa ku kombisa ku tinyiketela eka Xikwembu naswona ku fungha ku vekiwa ka munhu tanihi mutirheli\n", "2020-01-12 15:25:15,334 \tHypothesis: Nkhuvulo wu kombisa ku tinyiketela eka Xikwembu ni ku teka munhu a ri mutirheli\n", "2020-01-12 15:25:15,334 Validation result (greedy) at epoch 6, step 45000: bleu: 33.86, loss: 36675.6992, ppl: 3.4623, duration: 37.7303s\n", "2020-01-12 15:25:29,763 Epoch 6 Step: 45100 Batch Loss: 1.353897 Tokens per Sec: 17213, Lr: 0.000300\n", "2020-01-12 15:25:44,259 Epoch 6 Step: 45200 Batch Loss: 1.512908 Tokens per Sec: 17788, Lr: 0.000300\n", "2020-01-12 15:25:58,460 Epoch 6 Step: 45300 Batch Loss: 1.419662 Tokens per Sec: 16715, Lr: 0.000300\n", "2020-01-12 15:26:12,866 Epoch 6 Step: 45400 Batch Loss: 1.199331 Tokens per Sec: 17865, Lr: 0.000300\n", "2020-01-12 15:26:27,282 Epoch 6 Step: 45500 Batch Loss: 1.293400 Tokens per Sec: 17335, Lr: 0.000300\n", "2020-01-12 15:26:41,648 Epoch 6 Step: 45600 Batch Loss: 1.343354 Tokens per Sec: 17630, Lr: 0.000300\n", "2020-01-12 15:26:56,022 Epoch 6 Step: 45700 Batch Loss: 1.180933 Tokens per Sec: 17478, Lr: 0.000300\n", "2020-01-12 15:27:10,345 Epoch 6 Step: 45800 Batch Loss: 1.260049 Tokens per Sec: 17349, Lr: 0.000300\n", "2020-01-12 15:27:24,720 Epoch 6 Step: 45900 Batch Loss: 1.428155 Tokens per Sec: 17084, Lr: 0.000300\n", "2020-01-12 15:27:39,066 Epoch 6 Step: 46000 Batch Loss: 1.218810 Tokens per Sec: 17594, Lr: 0.000300\n", "2020-01-12 15:28:16,387 Hooray! 
New best validation result [ppl]!\n", "2020-01-12 15:28:16,387 Saving new checkpoint.\n", "2020-01-12 15:28:16,774 Example #0\n", "2020-01-12 15:28:16,774 \tSource: But , naturally , we also want relief .\n", "2020-01-12 15:28:16,774 \tReference: Kambe hi lava leswaku ku xaniseka ku herisiwa .\n", "2020-01-12 15:28:16,775 \tHypothesis: Kambe , hi ntolovelo hi lava ku ntshunxeka .\n", "2020-01-12 15:28:16,775 Example #1\n", "2020-01-12 15:28:16,775 \tSource: 3 “ We’re Letting You Go ”\n", "2020-01-12 15:28:16,775 \tReference: 3 “ Ntirho Wa Wena Wu Herile ”\n", "2020-01-12 15:28:16,776 \tHypothesis: 3 “ Hi Ku Langutela ”\n", "2020-01-12 15:28:16,776 Example #2\n", "2020-01-12 15:28:16,776 \tSource: Paralleling our actions against terrorism , we must have equally vigorous efforts to enhance freedom , dignity , justice , and humanitarian values .\n", "2020-01-12 15:28:16,776 \tReference: Loko hi ri karhi hi lwisana ni vutherorisi , hi fanele hi endla matshalatshala lamakulu yo lwela ntshunxeko , xindzhuti , vululami ni ku antswisa timfanelo ta hina .\n", "2020-01-12 15:28:16,776 \tHypothesis: Loko hi tshika swiendlo swa hina swo lwisana ni vutherorisi , hi fanele hi va ni matshalatshala lama engetelekeke yo kuma ntshunxeko , ku xiximeka , vululami ni mimpimanyeto ya vanhu .\n", "2020-01-12 15:28:16,776 Example #3\n", "2020-01-12 15:28:16,776 \tSource: Baptism symbolizes dedication to God and marks one’s ordination as a minister\n", "2020-01-12 15:28:16,776 \tReference: Ku khuvuriwa ku kombisa ku tinyiketela eka Xikwembu naswona ku fungha ku vekiwa ka munhu tanihi mutirheli\n", "2020-01-12 15:28:16,776 \tHypothesis: Nkhuvulo wu kombisa ku tinyiketela eka Xikwembu ni ku fungha ku va mutirheli\n", "2020-01-12 15:28:16,776 Validation result (greedy) at epoch 6, step 46000: bleu: 33.81, loss: 36625.6719, ppl: 3.4565, duration: 37.7096s\n", "2020-01-12 15:28:31,217 Epoch 6 Step: 46100 Batch Loss: 1.154728 Tokens per Sec: 17346, Lr: 0.000300\n", "2020-01-12 15:28:45,586 Epoch 6 Step: 46200 Batch Loss: 1.371807 Tokens per Sec: 17458, Lr: 0.000300\n", "2020-01-12 15:28:59,841 Epoch 6 Step: 46300 Batch Loss: 1.229988 Tokens per Sec: 16876, Lr: 0.000300\n", "2020-01-12 15:29:14,142 Epoch 6 Step: 46400 Batch Loss: 1.398157 Tokens per Sec: 17289, Lr: 0.000300\n", "2020-01-12 15:29:28,501 Epoch 6 Step: 46500 Batch Loss: 1.229569 Tokens per Sec: 17451, Lr: 0.000300\n", "2020-01-12 15:29:42,921 Epoch 6 Step: 46600 Batch Loss: 1.619818 Tokens per Sec: 17376, Lr: 0.000300\n", "2020-01-12 15:29:57,199 Epoch 6 Step: 46700 Batch Loss: 1.304276 Tokens per Sec: 17254, Lr: 0.000300\n", "2020-01-12 15:30:11,505 Epoch 6 Step: 46800 Batch Loss: 1.387198 Tokens per Sec: 17371, Lr: 0.000300\n", "2020-01-12 15:30:25,915 Epoch 6 Step: 46900 Batch Loss: 1.251977 Tokens per Sec: 17077, Lr: 0.000300\n", "2020-01-12 15:30:40,360 Epoch 6 Step: 47000 Batch Loss: 1.399895 Tokens per Sec: 17818, Lr: 0.000300\n", "2020-01-12 15:31:17,663 Hooray! 
New best validation result [ppl]!\n", "2020-01-12 15:31:17,663 Saving new checkpoint.\n", "2020-01-12 15:31:17,966 Example #0\n", "2020-01-12 15:31:17,967 \tSource: But , naturally , we also want relief .\n", "2020-01-12 15:31:17,967 \tReference: Kambe hi lava leswaku ku xaniseka ku herisiwa .\n", "2020-01-12 15:31:17,967 \tHypothesis: Kambe , hi ntolovelo , hi lava ku ntshunxeka .\n", "2020-01-12 15:31:17,967 Example #1\n", "2020-01-12 15:31:17,967 \tSource: 3 “ We’re Letting You Go ”\n", "2020-01-12 15:31:17,967 \tReference: 3 “ Ntirho Wa Wena Wu Herile ”\n", "2020-01-12 15:31:17,967 \tHypothesis: 3 “ Hi Ta Ku Tirhisa ”\n", "2020-01-12 15:31:17,967 Example #2\n", "2020-01-12 15:31:17,968 \tSource: Paralleling our actions against terrorism , we must have equally vigorous efforts to enhance freedom , dignity , justice , and humanitarian values .\n", "2020-01-12 15:31:17,968 \tReference: Loko hi ri karhi hi lwisana ni vutherorisi , hi fanele hi endla matshalatshala lamakulu yo lwela ntshunxeko , xindzhuti , vululami ni ku antswisa timfanelo ta hina .\n", "2020-01-12 15:31:17,968 \tHypothesis: Loko hi tshika swiendlo swa hina swo lwisana ni vutherorisi , hi fanele hi va ni matshalatshala lama tiyeke yo kuma ntshunxeko , ku xiximiwa , vululami ni mahanyelo ya vanhu .\n", "2020-01-12 15:31:17,968 Example #3\n", "2020-01-12 15:31:17,968 \tSource: Baptism symbolizes dedication to God and marks one’s ordination as a minister\n", "2020-01-12 15:31:17,968 \tReference: Ku khuvuriwa ku kombisa ku tinyiketela eka Xikwembu naswona ku fungha ku vekiwa ka munhu tanihi mutirheli\n", "2020-01-12 15:31:17,968 \tHypothesis: Nkhuvulo wu fanekisela ku tinyiketela eka Xikwembu ni ku fungha ku va mutirheli\n", "2020-01-12 15:31:17,969 Validation result (greedy) at epoch 6, step 47000: bleu: 33.41, loss: 36611.3672, ppl: 3.4548, duration: 37.6083s\n", "2020-01-12 15:31:32,328 Epoch 6 Step: 47100 Batch Loss: 1.412104 Tokens per Sec: 17165, Lr: 0.000300\n", "2020-01-12 15:31:46,772 Epoch 6 Step: 47200 Batch Loss: 1.293237 Tokens per Sec: 17235, Lr: 0.000300\n", "2020-01-12 15:32:01,276 Epoch 6 Step: 47300 Batch Loss: 1.532695 Tokens per Sec: 17655, Lr: 0.000300\n", "2020-01-12 15:32:15,589 Epoch 6 Step: 47400 Batch Loss: 1.195363 Tokens per Sec: 17385, Lr: 0.000300\n", "2020-01-12 15:32:29,968 Epoch 6 Step: 47500 Batch Loss: 1.363247 Tokens per Sec: 17379, Lr: 0.000300\n", "2020-01-12 15:32:44,411 Epoch 6 Step: 47600 Batch Loss: 1.520974 Tokens per Sec: 17633, Lr: 0.000300\n", "2020-01-12 15:32:58,853 Epoch 6 Step: 47700 Batch Loss: 1.414441 Tokens per Sec: 17231, Lr: 0.000300\n", "2020-01-12 15:33:13,219 Epoch 6 Step: 47800 Batch Loss: 1.362321 Tokens per Sec: 17522, Lr: 0.000300\n", "2020-01-12 15:33:27,560 Epoch 6 Step: 47900 Batch Loss: 1.430865 Tokens per Sec: 17345, Lr: 0.000300\n", "2020-01-12 15:33:42,048 Epoch 6 Step: 48000 Batch Loss: 1.499731 Tokens per Sec: 17679, Lr: 0.000300\n", "2020-01-12 15:34:19,276 Hooray! 
New best validation result [ppl]!\n", "2020-01-12 15:34:19,276 Saving new checkpoint.\n", "2020-01-12 15:34:19,610 Example #0\n", "2020-01-12 15:34:19,611 \tSource: But , naturally , we also want relief .\n", "2020-01-12 15:34:19,611 \tReference: Kambe hi lava leswaku ku xaniseka ku herisiwa .\n", "2020-01-12 15:34:19,611 \tHypothesis: Kambe , hi ntolovelo , na hina hi lava ku ntshunxiwa .\n", "2020-01-12 15:34:19,611 Example #1\n", "2020-01-12 15:34:19,611 \tSource: 3 “ We’re Letting You Go ”\n", "2020-01-12 15:34:19,611 \tReference: 3 “ Ntirho Wa Wena Wu Herile ”\n", "2020-01-12 15:34:19,611 \tHypothesis: 3 “ Hi Ku Tirhisa ”\n", "2020-01-12 15:34:19,611 Example #2\n", "2020-01-12 15:34:19,612 \tSource: Paralleling our actions against terrorism , we must have equally vigorous efforts to enhance freedom , dignity , justice , and humanitarian values .\n", "2020-01-12 15:34:19,612 \tReference: Loko hi ri karhi hi lwisana ni vutherorisi , hi fanele hi endla matshalatshala lamakulu yo lwela ntshunxeko , xindzhuti , vululami ni ku antswisa timfanelo ta hina .\n", "2020-01-12 15:34:19,612 \tHypothesis: Loko hi tshika swiendlo swa hina swo lwisana ni vutherorisi , hi fanele hi va ni matshalatshala lama engetelekeke yo herisa ntshunxeko , ku xiximiwa , vululami ni mimpimanyeto ya vanhu .\n", "2020-01-12 15:34:19,612 Example #3\n", "2020-01-12 15:34:19,612 \tSource: Baptism symbolizes dedication to God and marks one’s ordination as a minister\n", "2020-01-12 15:34:19,612 \tReference: Ku khuvuriwa ku kombisa ku tinyiketela eka Xikwembu naswona ku fungha ku vekiwa ka munhu tanihi mutirheli\n", "2020-01-12 15:34:19,612 \tHypothesis: Nkhuvulo wu fanekisela ku tinyiketela eka Xikwembu naswona wu fungha ku fuma ka munhu tanihi mutirheli\n", "2020-01-12 15:34:19,612 Validation result (greedy) at epoch 6, step 48000: bleu: 34.24, loss: 36390.1172, ppl: 3.4290, duration: 37.5643s\n", "2020-01-12 15:34:33,929 Epoch 6 Step: 48100 Batch Loss: 1.295433 Tokens per Sec: 17370, Lr: 0.000300\n", "2020-01-12 15:34:48,303 Epoch 6 Step: 48200 Batch Loss: 1.385883 Tokens per Sec: 17569, Lr: 0.000300\n", "2020-01-12 15:35:02,553 Epoch 6 Step: 48300 Batch Loss: 1.192043 Tokens per Sec: 17188, Lr: 0.000300\n", "2020-01-12 15:35:16,939 Epoch 6 Step: 48400 Batch Loss: 1.325741 Tokens per Sec: 17894, Lr: 0.000300\n", "2020-01-12 15:35:31,260 Epoch 6 Step: 48500 Batch Loss: 1.431946 Tokens per Sec: 17519, Lr: 0.000300\n", "2020-01-12 15:35:45,582 Epoch 6 Step: 48600 Batch Loss: 1.365553 Tokens per Sec: 17426, Lr: 0.000300\n", "2020-01-12 15:35:59,874 Epoch 6 Step: 48700 Batch Loss: 1.245690 Tokens per Sec: 17546, Lr: 0.000300\n", "2020-01-12 15:36:14,259 Epoch 6 Step: 48800 Batch Loss: 1.543017 Tokens per Sec: 17625, Lr: 0.000300\n", "2020-01-12 15:36:28,508 Epoch 6 Step: 48900 Batch Loss: 1.342993 Tokens per Sec: 17206, Lr: 0.000300\n", "2020-01-12 15:36:42,827 Epoch 6 Step: 49000 Batch Loss: 1.342685 Tokens per Sec: 17552, Lr: 0.000300\n", "2020-01-12 15:37:20,094 Hooray! 
New best validation result [ppl]!\n", "2020-01-12 15:37:20,094 Saving new checkpoint.\n", "2020-01-12 15:37:20,434 Example #0\n", "2020-01-12 15:37:20,435 \tSource: But , naturally , we also want relief .\n", "2020-01-12 15:37:20,435 \tReference: Kambe hi lava leswaku ku xaniseka ku herisiwa .\n", "2020-01-12 15:37:20,435 \tHypothesis: Kambe , hi ntolovelo hi lava ku ntshunxeka .\n", "2020-01-12 15:37:20,435 Example #1\n", "2020-01-12 15:37:20,435 \tSource: 3 “ We’re Letting You Go ”\n", "2020-01-12 15:37:20,436 \tReference: 3 “ Ntirho Wa Wena Wu Herile ”\n", "2020-01-12 15:37:20,436 \tHypothesis: 3 “ Hi Ku Lavisisa ”\n", "2020-01-12 15:37:20,436 Example #2\n", "2020-01-12 15:37:20,436 \tSource: Paralleling our actions against terrorism , we must have equally vigorous efforts to enhance freedom , dignity , justice , and humanitarian values .\n", "2020-01-12 15:37:20,436 \tReference: Loko hi ri karhi hi lwisana ni vutherorisi , hi fanele hi endla matshalatshala lamakulu yo lwela ntshunxeko , xindzhuti , vululami ni ku antswisa timfanelo ta hina .\n", "2020-01-12 15:37:20,436 \tHypothesis: Loko hi tshika swiendlo swa hina swo lwisana ni ku chava , hi fanele hi tikarhatela ku kuma ntshunxeko , ku xiximiwa , vululami ni mimpimanyeto ya vanhu .\n", "2020-01-12 15:37:20,436 Example #3\n", "2020-01-12 15:37:20,436 \tSource: Baptism symbolizes dedication to God and marks one’s ordination as a minister\n", "2020-01-12 15:37:20,437 \tReference: Ku khuvuriwa ku kombisa ku tinyiketela eka Xikwembu naswona ku fungha ku vekiwa ka munhu tanihi mutirheli\n", "2020-01-12 15:37:20,437 \tHypothesis: Nkhuvulo wu fanekisela ku tinyiketela eka Xikwembu naswona wu teka ku fuma ka munhu tanihi mutirheli\n", "2020-01-12 15:37:20,437 Validation result (greedy) at epoch 6, step 49000: bleu: 33.35, loss: 36214.1836, ppl: 3.4086, duration: 37.6094s\n", "2020-01-12 15:37:34,734 Epoch 6 Step: 49100 Batch Loss: 1.502014 Tokens per Sec: 17252, Lr: 0.000300\n", "2020-01-12 15:37:49,135 Epoch 6 Step: 49200 Batch Loss: 1.348871 Tokens per Sec: 17352, Lr: 0.000300\n", "2020-01-12 15:38:03,522 Epoch 6 Step: 49300 Batch Loss: 1.250001 Tokens per Sec: 17415, Lr: 0.000300\n", "2020-01-12 15:38:17,830 Epoch 6 Step: 49400 Batch Loss: 1.551826 Tokens per Sec: 17805, Lr: 0.000300\n", "2020-01-12 15:38:32,255 Epoch 6 Step: 49500 Batch Loss: 1.530701 Tokens per Sec: 17629, Lr: 0.000300\n", "2020-01-12 15:38:46,657 Epoch 6 Step: 49600 Batch Loss: 1.349638 Tokens per Sec: 17318, Lr: 0.000300\n", "2020-01-12 15:39:01,068 Epoch 6 Step: 49700 Batch Loss: 1.257083 Tokens per Sec: 17577, Lr: 0.000300\n", "2020-01-12 15:39:15,374 Epoch 6 Step: 49800 Batch Loss: 1.274286 Tokens per Sec: 17293, Lr: 0.000300\n", "2020-01-12 15:39:29,750 Epoch 6 Step: 49900 Batch Loss: 1.210282 Tokens per Sec: 17389, Lr: 0.000300\n", "2020-01-12 15:39:44,053 Epoch 6 Step: 50000 Batch Loss: 1.363310 Tokens per Sec: 17157, Lr: 0.000300\n", "2020-01-12 15:40:21,243 Hooray! 
New best validation result [ppl]!\n", "2020-01-12 15:40:21,243 Saving new checkpoint.\n", "2020-01-12 15:40:21,556 Example #0\n", "2020-01-12 15:40:21,557 \tSource: But , naturally , we also want relief .\n", "2020-01-12 15:40:21,557 \tReference: Kambe hi lava leswaku ku xaniseka ku herisiwa .\n", "2020-01-12 15:40:21,557 \tHypothesis: Kambe , hi ntumbuluko , na hina hi lava ku ntshunxiwa .\n", "2020-01-12 15:40:21,557 Example #1\n", "2020-01-12 15:40:21,557 \tSource: 3 “ We’re Letting You Go ”\n", "2020-01-12 15:40:21,557 \tReference: 3 “ Ntirho Wa Wena Wu Herile ”\n", "2020-01-12 15:40:21,558 \tHypothesis: 3 “ Hi Famba Hi Ku Tirhisa ”\n", "2020-01-12 15:40:21,558 Example #2\n", "2020-01-12 15:40:21,558 \tSource: Paralleling our actions against terrorism , we must have equally vigorous efforts to enhance freedom , dignity , justice , and humanitarian values .\n", "2020-01-12 15:40:21,558 \tReference: Loko hi ri karhi hi lwisana ni vutherorisi , hi fanele hi endla matshalatshala lamakulu yo lwela ntshunxeko , xindzhuti , vululami ni ku antswisa timfanelo ta hina .\n", "2020-01-12 15:40:21,558 \tHypothesis: Loko hi tshika swiendlo swa hina swo lwisana ni vutherorisi , hi fanele hi va ni matshalatshala lama tirhaka hi matimba yo herisa ntshunxeko , ku xiximeka , vululami ni mimpimanyeto ya vanhu .\n", "2020-01-12 15:40:21,558 Example #3\n", "2020-01-12 15:40:21,558 \tSource: Baptism symbolizes dedication to God and marks one’s ordination as a minister\n", "2020-01-12 15:40:21,559 \tReference: Ku khuvuriwa ku kombisa ku tinyiketela eka Xikwembu naswona ku fungha ku vekiwa ka munhu tanihi mutirheli\n", "2020-01-12 15:40:21,559 \tHypothesis: Nkhuvulo wu kombisa ku tinyiketela eka Xikwembu naswona wu teka ku va mutirheli tanihi mutirheli\n", "2020-01-12 15:40:21,559 Validation result (greedy) at epoch 6, step 50000: bleu: 33.80, loss: 35913.8828, ppl: 3.3741, duration: 37.5052s\n", "2020-01-12 15:40:35,893 Epoch 6 Step: 50100 Batch Loss: 1.298641 Tokens per Sec: 17386, Lr: 0.000300\n", "2020-01-12 15:40:50,193 Epoch 6 Step: 50200 Batch Loss: 1.254532 Tokens per Sec: 16984, Lr: 0.000300\n", "2020-01-12 15:41:04,654 Epoch 6 Step: 50300 Batch Loss: 1.373006 Tokens per Sec: 17298, Lr: 0.000300\n", "2020-01-12 15:41:18,983 Epoch 6 Step: 50400 Batch Loss: 1.400129 Tokens per Sec: 17323, Lr: 0.000300\n", "2020-01-12 15:41:33,310 Epoch 6 Step: 50500 Batch Loss: 1.575600 Tokens per Sec: 17544, Lr: 0.000300\n", "2020-01-12 15:41:47,753 Epoch 6 Step: 50600 Batch Loss: 1.375535 Tokens per Sec: 17460, Lr: 0.000300\n", "2020-01-12 15:42:02,119 Epoch 6 Step: 50700 Batch Loss: 1.403999 Tokens per Sec: 17235, Lr: 0.000300\n", "2020-01-12 15:42:16,515 Epoch 6 Step: 50800 Batch Loss: 1.621812 Tokens per Sec: 17442, Lr: 0.000300\n", "2020-01-12 15:42:30,840 Epoch 6 Step: 50900 Batch Loss: 1.203022 Tokens per Sec: 17509, Lr: 0.000300\n", "2020-01-12 15:42:45,227 Epoch 6 Step: 51000 Batch Loss: 1.428470 Tokens per Sec: 17306, Lr: 0.000300\n", "2020-01-12 15:43:22,446 Hooray! 
New best validation result [ppl]!\n", "2020-01-12 15:43:22,446 Saving new checkpoint.\n", "2020-01-12 15:43:22,810 Example #0\n", "2020-01-12 15:43:22,810 \tSource: But , naturally , we also want relief .\n", "2020-01-12 15:43:22,810 \tReference: Kambe hi lava leswaku ku xaniseka ku herisiwa .\n", "2020-01-12 15:43:22,810 \tHypothesis: Kambe , hi ntolovelo , hi lava ku ntshunxiwa .\n", "2020-01-12 15:43:22,810 Example #1\n", "2020-01-12 15:43:22,810 \tSource: 3 “ We’re Letting You Go ”\n", "2020-01-12 15:43:22,810 \tReference: 3 “ Ntirho Wa Wena Wu Herile ”\n", "2020-01-12 15:43:22,811 \tHypothesis: 3 “ Hi Ku Tiyisela ”\n", "2020-01-12 15:43:22,811 Example #2\n", "2020-01-12 15:43:22,811 \tSource: Paralleling our actions against terrorism , we must have equally vigorous efforts to enhance freedom , dignity , justice , and humanitarian values .\n", "2020-01-12 15:43:22,811 \tReference: Loko hi ri karhi hi lwisana ni vutherorisi , hi fanele hi endla matshalatshala lamakulu yo lwela ntshunxeko , xindzhuti , vululami ni ku antswisa timfanelo ta hina .\n", "2020-01-12 15:43:22,811 \tHypothesis: Loko hi pfumelela swiendlo swa hina swo lwisana ni vutherorisi , hi fanele hi tikarhatela ku kuma ntshunxeko , ku xiximeka , vululami ni mimpimanyeto ya vanhu .\n", "2020-01-12 15:43:22,811 Example #3\n", "2020-01-12 15:43:22,811 \tSource: Baptism symbolizes dedication to God and marks one’s ordination as a minister\n", "2020-01-12 15:43:22,811 \tReference: Ku khuvuriwa ku kombisa ku tinyiketela eka Xikwembu naswona ku fungha ku vekiwa ka munhu tanihi mutirheli\n", "2020-01-12 15:43:22,811 \tHypothesis: Nkhuvulo wu fanekisela ku tinyiketela eka Xikwembu ni ku fungha ku va mutirheli\n", "2020-01-12 15:43:22,812 Validation result (greedy) at epoch 6, step 51000: bleu: 34.03, loss: 35825.2188, ppl: 3.3640, duration: 37.5843s\n", "2020-01-12 15:43:37,103 Epoch 6 Step: 51100 Batch Loss: 1.279621 Tokens per Sec: 17334, Lr: 0.000300\n", "2020-01-12 15:43:51,518 Epoch 6 Step: 51200 Batch Loss: 1.480877 Tokens per Sec: 17648, Lr: 0.000300\n", "2020-01-12 15:44:05,791 Epoch 6 Step: 51300 Batch Loss: 1.346743 Tokens per Sec: 17399, Lr: 0.000300\n", "2020-01-12 15:44:20,099 Epoch 6 Step: 51400 Batch Loss: 1.487468 Tokens per Sec: 17203, Lr: 0.000300\n", "2020-01-12 15:44:34,393 Epoch 6 Step: 51500 Batch Loss: 1.274932 Tokens per Sec: 17359, Lr: 0.000300\n", "2020-01-12 15:44:48,725 Epoch 6 Step: 51600 Batch Loss: 1.310887 Tokens per Sec: 17732, Lr: 0.000300\n", "2020-01-12 15:45:03,188 Epoch 6 Step: 51700 Batch Loss: 1.381442 Tokens per Sec: 17871, Lr: 0.000300\n", "2020-01-12 15:45:17,519 Epoch 6 Step: 51800 Batch Loss: 1.318413 Tokens per Sec: 17167, Lr: 0.000300\n", "2020-01-12 15:45:31,922 Epoch 6 Step: 51900 Batch Loss: 1.231434 Tokens per Sec: 17632, Lr: 0.000300\n", "2020-01-12 15:45:46,314 Epoch 6 Step: 52000 Batch Loss: 1.502214 Tokens per Sec: 17831, Lr: 0.000300\n", "2020-01-12 15:46:23,690 Example #0\n", "2020-01-12 15:46:23,690 \tSource: But , naturally , we also want relief .\n", "2020-01-12 15:46:23,690 \tReference: Kambe hi lava leswaku ku xaniseka ku herisiwa .\n", "2020-01-12 15:46:23,690 \tHypothesis: Kambe , hi ntolovelo , hi lava ku ntshunxeka .\n", "2020-01-12 15:46:23,691 Example #1\n", "2020-01-12 15:46:23,691 \tSource: 3 “ We’re Letting You Go ”\n", "2020-01-12 15:46:23,691 \tReference: 3 “ Ntirho Wa Wena Wu Herile ”\n", "2020-01-12 15:46:23,691 \tHypothesis: 3 “ Hi Ku Langutela ”\n", "2020-01-12 15:46:23,691 Example #2\n", "2020-01-12 15:46:23,691 \tSource: Paralleling our actions against 
terrorism , we must have equally vigorous efforts to enhance freedom , dignity , justice , and humanitarian values .\n", "2020-01-12 15:46:23,691 \tReference: Loko hi ri karhi hi lwisana ni vutherorisi , hi fanele hi endla matshalatshala lamakulu yo lwela ntshunxeko , xindzhuti , vululami ni ku antswisa timfanelo ta hina .\n", "2020-01-12 15:46:23,691 \tHypothesis: Loko hi tshika swiendlo swa hina swo biha , hi fanele hi va ni matshalatshala lama engetelekeke yo kuma ntshunxeko , ku xiximeka , vululami ni mahanyelo ya vanhu .\n", "2020-01-12 15:46:23,691 Example #3\n", "2020-01-12 15:46:23,692 \tSource: Baptism symbolizes dedication to God and marks one’s ordination as a minister\n", "2020-01-12 15:46:23,692 \tReference: Ku khuvuriwa ku kombisa ku tinyiketela eka Xikwembu naswona ku fungha ku vekiwa ka munhu tanihi mutirheli\n", "2020-01-12 15:46:23,692 \tHypothesis: Nkhuvulo wu kombisa ku tinyiketela eka Xikwembu naswona wu teka ku ri mutirheli tanihi mutirheli\n", "2020-01-12 15:46:23,692 Validation result (greedy) at epoch 6, step 52000: bleu: 33.77, loss: 35829.1953, ppl: 3.3645, duration: 37.3779s\n", "2020-01-12 15:46:37,985 Epoch 6 Step: 52100 Batch Loss: 1.271407 Tokens per Sec: 17473, Lr: 0.000300\n", "2020-01-12 15:46:52,321 Epoch 6 Step: 52200 Batch Loss: 1.245620 Tokens per Sec: 17362, Lr: 0.000300\n", "2020-01-12 15:47:06,771 Epoch 6 Step: 52300 Batch Loss: 1.424373 Tokens per Sec: 17941, Lr: 0.000300\n", "2020-01-12 15:47:21,148 Epoch 6 Step: 52400 Batch Loss: 1.250396 Tokens per Sec: 17361, Lr: 0.000300\n", "2020-01-12 15:47:35,434 Epoch 6 Step: 52500 Batch Loss: 1.342838 Tokens per Sec: 17237, Lr: 0.000300\n", "2020-01-12 15:47:49,893 Epoch 6 Step: 52600 Batch Loss: 1.536923 Tokens per Sec: 17642, Lr: 0.000300\n", "2020-01-12 15:48:04,215 Epoch 6 Step: 52700 Batch Loss: 1.319900 Tokens per Sec: 17256, Lr: 0.000300\n", "2020-01-12 15:48:18,595 Epoch 6 Step: 52800 Batch Loss: 1.437133 Tokens per Sec: 17753, Lr: 0.000300\n", "2020-01-12 15:48:32,968 Epoch 6 Step: 52900 Batch Loss: 1.308154 Tokens per Sec: 17479, Lr: 0.000300\n", "2020-01-12 15:48:47,392 Epoch 6 Step: 53000 Batch Loss: 1.482153 Tokens per Sec: 17419, Lr: 0.000300\n", "2020-01-12 15:49:24,710 Hooray! 
New best validation result [ppl]!\n", "2020-01-12 15:49:24,710 Saving new checkpoint.\n", "2020-01-12 15:49:25,027 Example #0\n", "2020-01-12 15:49:25,027 \tSource: But , naturally , we also want relief .\n", "2020-01-12 15:49:25,027 \tReference: Kambe hi lava leswaku ku xaniseka ku herisiwa .\n", "2020-01-12 15:49:25,027 \tHypothesis: Kambe , hi ntolovelo , hi lava ku phalala .\n", "2020-01-12 15:49:25,028 Example #1\n", "2020-01-12 15:49:25,028 \tSource: 3 “ We’re Letting You Go ”\n", "2020-01-12 15:49:25,029 \tReference: 3 “ Ntirho Wa Wena Wu Herile ”\n", "2020-01-12 15:49:25,029 \tHypothesis: 3 “ Hi Ku Lavisisa ”\n", "2020-01-12 15:49:25,029 Example #2\n", "2020-01-12 15:49:25,029 \tSource: Paralleling our actions against terrorism , we must have equally vigorous efforts to enhance freedom , dignity , justice , and humanitarian values .\n", "2020-01-12 15:49:25,029 \tReference: Loko hi ri karhi hi lwisana ni vutherorisi , hi fanele hi endla matshalatshala lamakulu yo lwela ntshunxeko , xindzhuti , vululami ni ku antswisa timfanelo ta hina .\n", "2020-01-12 15:49:25,029 \tHypothesis: Loko hi pfumelela swiendlo swa hina swo lwisana ni vutherorisi , hi fanele hi tikarhatela ku kuma ntshunxeko , ku xiximeka , vululami ni mimpimanyeto ya vanhu .\n", "2020-01-12 15:49:25,029 Example #3\n", "2020-01-12 15:49:25,029 \tSource: Baptism symbolizes dedication to God and marks one’s ordination as a minister\n", "2020-01-12 15:49:25,029 \tReference: Ku khuvuriwa ku kombisa ku tinyiketela eka Xikwembu naswona ku fungha ku vekiwa ka munhu tanihi mutirheli\n", "2020-01-12 15:49:25,029 \tHypothesis: Nkhuvulo wu fanekisela ku tinyiketela eka Xikwembu ni ku fungha ku va mutirheli\n", "2020-01-12 15:49:25,030 Validation result (greedy) at epoch 6, step 53000: bleu: 33.86, loss: 35491.4336, ppl: 3.3262, duration: 37.6372s\n", "2020-01-12 15:49:39,355 Epoch 6 Step: 53100 Batch Loss: 1.862130 Tokens per Sec: 17470, Lr: 0.000300\n", "2020-01-12 15:49:53,732 Epoch 6 Step: 53200 Batch Loss: 1.259164 Tokens per Sec: 17255, Lr: 0.000300\n", "2020-01-12 15:50:08,073 Epoch 6 Step: 53300 Batch Loss: 1.387532 Tokens per Sec: 17512, Lr: 0.000300\n", "2020-01-12 15:50:22,479 Epoch 6 Step: 53400 Batch Loss: 1.255235 Tokens per Sec: 17495, Lr: 0.000300\n", "2020-01-12 15:50:36,773 Epoch 6 Step: 53500 Batch Loss: 1.186903 Tokens per Sec: 17235, Lr: 0.000300\n", "2020-01-12 15:50:51,182 Epoch 6 Step: 53600 Batch Loss: 1.364636 Tokens per Sec: 17678, Lr: 0.000300\n", "2020-01-12 15:51:05,556 Epoch 6 Step: 53700 Batch Loss: 1.352661 Tokens per Sec: 17352, Lr: 0.000300\n", "2020-01-12 15:51:19,901 Epoch 6 Step: 53800 Batch Loss: 1.391301 Tokens per Sec: 17573, Lr: 0.000300\n", "2020-01-12 15:51:21,529 Epoch 6: total training loss 12383.15\n", "2020-01-12 15:51:21,530 EPOCH 7\n", "2020-01-12 15:51:35,567 Epoch 7 Step: 53900 Batch Loss: 1.329164 Tokens per Sec: 16107, Lr: 0.000300\n", "2020-01-12 15:51:49,975 Epoch 7 Step: 54000 Batch Loss: 1.159281 Tokens per Sec: 17106, Lr: 0.000300\n", "2020-01-12 15:52:27,277 Hooray! 
New best validation result [ppl]!\n", "2020-01-12 15:52:27,277 Saving new checkpoint.\n", "2020-01-12 15:52:27,624 Example #0\n", "2020-01-12 15:52:27,624 \tSource: But , naturally , we also want relief .\n", "2020-01-12 15:52:27,624 \tReference: Kambe hi lava leswaku ku xaniseka ku herisiwa .\n", "2020-01-12 15:52:27,624 \tHypothesis: Kambe , hi ntumbuluko hi lava ku ntshunxeka .\n", "2020-01-12 15:52:27,625 Example #1\n", "2020-01-12 15:52:27,625 \tSource: 3 “ We’re Letting You Go ”\n", "2020-01-12 15:52:27,625 \tReference: 3 “ Ntirho Wa Wena Wu Herile ”\n", "2020-01-12 15:52:27,625 \tHypothesis: 3 “ Hi Ku Lavisisa ”\n", "2020-01-12 15:52:27,625 Example #2\n", "2020-01-12 15:52:27,625 \tSource: Paralleling our actions against terrorism , we must have equally vigorous efforts to enhance freedom , dignity , justice , and humanitarian values .\n", "2020-01-12 15:52:27,625 \tReference: Loko hi ri karhi hi lwisana ni vutherorisi , hi fanele hi endla matshalatshala lamakulu yo lwela ntshunxeko , xindzhuti , vululami ni ku antswisa timfanelo ta hina .\n", "2020-01-12 15:52:27,625 \tHypothesis: Loko hi tshika swiendlo swa hina swo lwisana ni vutherorisi , hi fanele hi tikarhatela ku kuma ntshunxeko , ku xiximiwa , vululami ni mimpimanyeto ya vanhu .\n", "2020-01-12 15:52:27,625 Example #3\n", "2020-01-12 15:52:27,626 \tSource: Baptism symbolizes dedication to God and marks one’s ordination as a minister\n", "2020-01-12 15:52:27,626 \tReference: Ku khuvuriwa ku kombisa ku tinyiketela eka Xikwembu naswona ku fungha ku vekiwa ka munhu tanihi mutirheli\n", "2020-01-12 15:52:27,626 \tHypothesis: Nkhuvulo wu kombisa ku tinyiketela eka Xikwembu ni ku fungha ku fuma ka munhu tanihi mutirheli\n", "2020-01-12 15:52:27,626 Validation result (greedy) at epoch 7, step 54000: bleu: 34.58, loss: 35431.9414, ppl: 3.3195, duration: 37.6509s\n", "2020-01-12 15:52:41,945 Epoch 7 Step: 54100 Batch Loss: 1.264197 Tokens per Sec: 17628, Lr: 0.000300\n", "2020-01-12 15:52:56,216 Epoch 7 Step: 54200 Batch Loss: 1.658572 Tokens per Sec: 16984, Lr: 0.000300\n", "2020-01-12 15:53:10,541 Epoch 7 Step: 54300 Batch Loss: 1.594602 Tokens per Sec: 17397, Lr: 0.000300\n", "2020-01-12 15:53:24,854 Epoch 7 Step: 54400 Batch Loss: 1.514920 Tokens per Sec: 17307, Lr: 0.000300\n", "2020-01-12 15:53:39,118 Epoch 7 Step: 54500 Batch Loss: 1.138659 Tokens per Sec: 17200, Lr: 0.000300\n", "2020-01-12 15:53:53,455 Epoch 7 Step: 54600 Batch Loss: 1.430985 Tokens per Sec: 17657, Lr: 0.000300\n", "2020-01-12 15:54:07,829 Epoch 7 Step: 54700 Batch Loss: 1.338993 Tokens per Sec: 17813, Lr: 0.000300\n", "2020-01-12 15:54:22,224 Epoch 7 Step: 54800 Batch Loss: 1.385051 Tokens per Sec: 17220, Lr: 0.000300\n", "2020-01-12 15:54:36,688 Epoch 7 Step: 54900 Batch Loss: 1.472373 Tokens per Sec: 18178, Lr: 0.000300\n", "2020-01-12 15:54:51,089 Epoch 7 Step: 55000 Batch Loss: 1.279779 Tokens per Sec: 17462, Lr: 0.000300\n", "2020-01-12 15:55:28,416 Example #0\n", "2020-01-12 15:55:28,417 \tSource: But , naturally , we also want relief .\n", "2020-01-12 15:55:28,417 \tReference: Kambe hi lava leswaku ku xaniseka ku herisiwa .\n", "2020-01-12 15:55:28,417 \tHypothesis: Kambe , hi ntolovelo , na hina hi lava ku ntshunxeka .\n", "2020-01-12 15:55:28,417 Example #1\n", "2020-01-12 15:55:28,417 \tSource: 3 “ We’re Letting You Go ”\n", "2020-01-12 15:55:28,417 \tReference: 3 “ Ntirho Wa Wena Wu Herile ”\n", "2020-01-12 15:55:28,418 \tHypothesis: 3 “ Hi Ku Langutela ”\n", "2020-01-12 15:55:28,418 Example #2\n", "2020-01-12 15:55:28,418 \tSource: Paralleling 
our actions against terrorism , we must have equally vigorous efforts to enhance freedom , dignity , justice , and humanitarian values .\n", "2020-01-12 15:55:28,418 \tReference: Loko hi ri karhi hi lwisana ni vutherorisi , hi fanele hi endla matshalatshala lamakulu yo lwela ntshunxeko , xindzhuti , vululami ni ku antswisa timfanelo ta hina .\n", "2020-01-12 15:55:28,418 \tHypothesis: Loko hi tshika swiendlo swa hina swo lwisana ni vutherorisi , hi fanele hi va ni matshalatshala lama fanaka yo herisa ntshunxeko , ku xiximiwa , vululami ni mimpimanyeto ya vanhu .\n", "2020-01-12 15:55:28,418 Example #3\n", "2020-01-12 15:55:28,418 \tSource: Baptism symbolizes dedication to God and marks one’s ordination as a minister\n", "2020-01-12 15:55:28,418 \tReference: Ku khuvuriwa ku kombisa ku tinyiketela eka Xikwembu naswona ku fungha ku vekiwa ka munhu tanihi mutirheli\n", "2020-01-12 15:55:28,419 \tHypothesis: Nkhuvulo wu fanekisela ku tinyiketela eka Xikwembu ni ku fungha ku va mutirheli\n", "2020-01-12 15:55:28,419 Validation result (greedy) at epoch 7, step 55000: bleu: 34.31, loss: 35645.2227, ppl: 3.3436, duration: 37.3294s\n", "2020-01-12 15:55:42,895 Epoch 7 Step: 55100 Batch Loss: 1.210796 Tokens per Sec: 17703, Lr: 0.000300\n", "2020-01-12 15:55:57,348 Epoch 7 Step: 55200 Batch Loss: 1.311736 Tokens per Sec: 17145, Lr: 0.000300\n", "2020-01-12 15:56:11,633 Epoch 7 Step: 55300 Batch Loss: 1.478075 Tokens per Sec: 17705, Lr: 0.000300\n", "2020-01-12 15:56:26,037 Epoch 7 Step: 55400 Batch Loss: 1.215440 Tokens per Sec: 17147, Lr: 0.000300\n", "2020-01-12 15:56:40,487 Epoch 7 Step: 55500 Batch Loss: 1.323566 Tokens per Sec: 17842, Lr: 0.000300\n", "2020-01-12 15:56:54,916 Epoch 7 Step: 55600 Batch Loss: 1.391615 Tokens per Sec: 17695, Lr: 0.000300\n", "2020-01-12 15:57:09,351 Epoch 7 Step: 55700 Batch Loss: 1.258518 Tokens per Sec: 17153, Lr: 0.000300\n", "2020-01-12 15:57:23,801 Epoch 7 Step: 55800 Batch Loss: 1.597507 Tokens per Sec: 17543, Lr: 0.000300\n", "2020-01-12 15:57:38,181 Epoch 7 Step: 55900 Batch Loss: 1.381259 Tokens per Sec: 17337, Lr: 0.000300\n", "2020-01-12 15:57:52,634 Epoch 7 Step: 56000 Batch Loss: 1.254975 Tokens per Sec: 17426, Lr: 0.000300\n", "2020-01-12 15:58:29,696 Hooray! 
New best validation result [ppl]!\n", "2020-01-12 15:58:29,696 Saving new checkpoint.\n", "2020-01-12 15:58:29,982 Example #0\n", "2020-01-12 15:58:29,982 \tSource: But , naturally , we also want relief .\n", "2020-01-12 15:58:29,982 \tReference: Kambe hi lava leswaku ku xaniseka ku herisiwa .\n", "2020-01-12 15:58:29,982 \tHypothesis: Kambe , hi ntumbuluko , na hina hi lava ku ntshunxiwa .\n", "2020-01-12 15:58:29,982 Example #1\n", "2020-01-12 15:58:29,982 \tSource: 3 “ We’re Letting You Go ”\n", "2020-01-12 15:58:29,982 \tReference: 3 “ Ntirho Wa Wena Wu Herile ”\n", "2020-01-12 15:58:29,983 \tHypothesis: 3 “ Hi Ku Langutana Na Wena ”\n", "2020-01-12 15:58:29,983 Example #2\n", "2020-01-12 15:58:29,983 \tSource: Paralleling our actions against terrorism , we must have equally vigorous efforts to enhance freedom , dignity , justice , and humanitarian values .\n", "2020-01-12 15:58:29,983 \tReference: Loko hi ri karhi hi lwisana ni vutherorisi , hi fanele hi endla matshalatshala lamakulu yo lwela ntshunxeko , xindzhuti , vululami ni ku antswisa timfanelo ta hina .\n", "2020-01-12 15:58:29,983 \tHypothesis: Loko hi pfumelela swiendlo swa hina swo lwisana ni vutherorisi , hi fanele hi tikarhatela ku kuma ntshunxeko , ku xiximiwa , vululami ni mimpimanyeto ya vanhu .\n", "2020-01-12 15:58:29,983 Example #3\n", "2020-01-12 15:58:29,983 \tSource: Baptism symbolizes dedication to God and marks one’s ordination as a minister\n", "2020-01-12 15:58:29,983 \tReference: Ku khuvuriwa ku kombisa ku tinyiketela eka Xikwembu naswona ku fungha ku vekiwa ka munhu tanihi mutirheli\n", "2020-01-12 15:58:29,983 \tHypothesis: Nkhuvulo wu fanekisela ku tinyiketela eka Xikwembu naswona wu fungha ku fuma ka munhu tanihi mutirheli\n", "2020-01-12 15:58:29,983 Validation result (greedy) at epoch 7, step 56000: bleu: 34.36, loss: 35330.9062, ppl: 3.3082, duration: 37.3490s\n", "2020-01-12 15:58:44,301 Epoch 7 Step: 56100 Batch Loss: 1.423456 Tokens per Sec: 17661, Lr: 0.000300\n", "2020-01-12 15:58:58,627 Epoch 7 Step: 56200 Batch Loss: 1.329918 Tokens per Sec: 17555, Lr: 0.000300\n", "2020-01-12 15:59:12,878 Epoch 7 Step: 56300 Batch Loss: 1.269975 Tokens per Sec: 17498, Lr: 0.000300\n", "2020-01-12 15:59:27,249 Epoch 7 Step: 56400 Batch Loss: 1.363827 Tokens per Sec: 17989, Lr: 0.000300\n", "2020-01-12 15:59:41,616 Epoch 7 Step: 56500 Batch Loss: 1.378385 Tokens per Sec: 17542, Lr: 0.000300\n", "2020-01-12 15:59:55,951 Epoch 7 Step: 56600 Batch Loss: 1.476383 Tokens per Sec: 17396, Lr: 0.000300\n", "2020-01-12 16:00:10,197 Epoch 7 Step: 56700 Batch Loss: 1.315426 Tokens per Sec: 17137, Lr: 0.000300\n", "2020-01-12 16:00:24,542 Epoch 7 Step: 56800 Batch Loss: 1.548531 Tokens per Sec: 17572, Lr: 0.000300\n", "2020-01-12 16:00:38,826 Epoch 7 Step: 56900 Batch Loss: 1.256970 Tokens per Sec: 17434, Lr: 0.000300\n", "2020-01-12 16:00:53,146 Epoch 7 Step: 57000 Batch Loss: 1.506114 Tokens per Sec: 17470, Lr: 0.000300\n", "2020-01-12 16:01:30,333 Hooray! 
New best validation result [ppl]!\n", "2020-01-12 16:01:30,333 Saving new checkpoint.\n", "2020-01-12 16:01:30,682 Example #0\n", "2020-01-12 16:01:30,683 \tSource: But , naturally , we also want relief .\n", "2020-01-12 16:01:30,683 \tReference: Kambe hi lava leswaku ku xaniseka ku herisiwa .\n", "2020-01-12 16:01:30,683 \tHypothesis: Kambe , hi ntolovelo , hi lava ku ntshunxeka .\n", "2020-01-12 16:01:30,683 Example #1\n", "2020-01-12 16:01:30,684 \tSource: 3 “ We’re Letting You Go ”\n", "2020-01-12 16:01:30,684 \tReference: 3 “ Ntirho Wa Wena Wu Herile ”\n", "2020-01-12 16:01:30,684 \tHypothesis: 3 “ Hi Ku Langutana Na Wena ”\n", "2020-01-12 16:01:30,684 Example #2\n", "2020-01-12 16:01:30,684 \tSource: Paralleling our actions against terrorism , we must have equally vigorous efforts to enhance freedom , dignity , justice , and humanitarian values .\n", "2020-01-12 16:01:30,684 \tReference: Loko hi ri karhi hi lwisana ni vutherorisi , hi fanele hi endla matshalatshala lamakulu yo lwela ntshunxeko , xindzhuti , vululami ni ku antswisa timfanelo ta hina .\n", "2020-01-12 16:01:30,685 \tHypothesis: Loko hi pfumelela swiendlo swa hina swo lwisana ni vutherorisi , hi fanele hi va ni matshalatshala lama engetelekeke yo kuma ntshunxeko , vululami , vululami ni mimpimanyeto ya vanhu .\n", "2020-01-12 16:01:30,685 Example #3\n", "2020-01-12 16:01:30,685 \tSource: Baptism symbolizes dedication to God and marks one’s ordination as a minister\n", "2020-01-12 16:01:30,685 \tReference: Ku khuvuriwa ku kombisa ku tinyiketela eka Xikwembu naswona ku fungha ku vekiwa ka munhu tanihi mutirheli\n", "2020-01-12 16:01:30,685 \tHypothesis: Nkhuvulo wu fanekisela ku tinyiketela eka Xikwembu naswona wu fungha ku fuma ka munhu tanihi mutirheli\n", "2020-01-12 16:01:30,685 Validation result (greedy) at epoch 7, step 57000: bleu: 34.33, loss: 35191.9844, ppl: 3.2927, duration: 37.5389s\n", "2020-01-12 16:01:45,112 Epoch 7 Step: 57100 Batch Loss: 1.314552 Tokens per Sec: 17291, Lr: 0.000300\n", "2020-01-12 16:01:59,437 Epoch 7 Step: 57200 Batch Loss: 1.400666 Tokens per Sec: 17462, Lr: 0.000300\n", "2020-01-12 16:02:13,737 Epoch 7 Step: 57300 Batch Loss: 1.258063 Tokens per Sec: 17257, Lr: 0.000300\n", "2020-01-12 16:02:28,125 Epoch 7 Step: 57400 Batch Loss: 1.232746 Tokens per Sec: 17308, Lr: 0.000300\n", "2020-01-12 16:02:42,441 Epoch 7 Step: 57500 Batch Loss: 1.314510 Tokens per Sec: 17371, Lr: 0.000300\n", "2020-01-12 16:02:56,750 Epoch 7 Step: 57600 Batch Loss: 1.261124 Tokens per Sec: 17424, Lr: 0.000300\n", "2020-01-12 16:03:11,044 Epoch 7 Step: 57700 Batch Loss: 1.357527 Tokens per Sec: 17166, Lr: 0.000300\n", "2020-01-12 16:03:25,512 Epoch 7 Step: 57800 Batch Loss: 1.112488 Tokens per Sec: 17538, Lr: 0.000300\n", "2020-01-12 16:03:39,824 Epoch 7 Step: 57900 Batch Loss: 1.051521 Tokens per Sec: 17550, Lr: 0.000300\n", "2020-01-12 16:03:54,211 Epoch 7 Step: 58000 Batch Loss: 1.376123 Tokens per Sec: 17319, Lr: 0.000300\n", "2020-01-12 16:04:31,486 Hooray! 
New best validation result [ppl]!\n", "2020-01-12 16:04:31,486 Saving new checkpoint.\n", "2020-01-12 16:04:31,776 Example #0\n", "2020-01-12 16:04:31,776 \tSource: But , naturally , we also want relief .\n", "2020-01-12 16:04:31,776 \tReference: Kambe hi lava leswaku ku xaniseka ku herisiwa .\n", "2020-01-12 16:04:31,776 \tHypothesis: Kambe , hi ntolovelo , hi lava ku phalala .\n", "2020-01-12 16:04:31,777 Example #1\n", "2020-01-12 16:04:31,777 \tSource: 3 “ We’re Letting You Go ”\n", "2020-01-12 16:04:31,777 \tReference: 3 “ Ntirho Wa Wena Wu Herile ”\n", "2020-01-12 16:04:31,777 \tHypothesis: 3 “ Hi Ku Tirhisa ”\n", "2020-01-12 16:04:31,777 Example #2\n", "2020-01-12 16:04:31,777 \tSource: Paralleling our actions against terrorism , we must have equally vigorous efforts to enhance freedom , dignity , justice , and humanitarian values .\n", "2020-01-12 16:04:31,777 \tReference: Loko hi ri karhi hi lwisana ni vutherorisi , hi fanele hi endla matshalatshala lamakulu yo lwela ntshunxeko , xindzhuti , vululami ni ku antswisa timfanelo ta hina .\n", "2020-01-12 16:04:31,777 \tHypothesis: Loko hi tshika swiendlo swa hina swo lwisana ni vutherorisi , hi fanele hi tikarhatela ku kuma ntshunxeko , ku va ni vululami ni mimpimanyeto ya vanhu .\n", "2020-01-12 16:04:31,778 Example #3\n", "2020-01-12 16:04:31,778 \tSource: Baptism symbolizes dedication to God and marks one’s ordination as a minister\n", "2020-01-12 16:04:31,778 \tReference: Ku khuvuriwa ku kombisa ku tinyiketela eka Xikwembu naswona ku fungha ku vekiwa ka munhu tanihi mutirheli\n", "2020-01-12 16:04:31,778 \tHypothesis: Nkhuvulo wu fanekisela ku tinyiketela eka Xikwembu naswona wu fungha ku va mutirheli\n", "2020-01-12 16:04:31,778 Validation result (greedy) at epoch 7, step 58000: bleu: 35.06, loss: 34966.1055, ppl: 3.2676, duration: 37.5663s\n", "2020-01-12 16:04:46,148 Epoch 7 Step: 58100 Batch Loss: 1.464739 Tokens per Sec: 17316, Lr: 0.000300\n", "2020-01-12 16:05:00,459 Epoch 7 Step: 58200 Batch Loss: 1.402646 Tokens per Sec: 17375, Lr: 0.000300\n", "2020-01-12 16:05:14,764 Epoch 7 Step: 58300 Batch Loss: 1.210747 Tokens per Sec: 17276, Lr: 0.000300\n", "2020-01-12 16:05:29,163 Epoch 7 Step: 58400 Batch Loss: 1.247718 Tokens per Sec: 17516, Lr: 0.000300\n", "2020-01-12 16:05:43,468 Epoch 7 Step: 58500 Batch Loss: 1.380952 Tokens per Sec: 17308, Lr: 0.000300\n", "2020-01-12 16:05:57,869 Epoch 7 Step: 58600 Batch Loss: 1.368387 Tokens per Sec: 17172, Lr: 0.000300\n", "2020-01-12 16:06:12,113 Epoch 7 Step: 58700 Batch Loss: 1.336550 Tokens per Sec: 17141, Lr: 0.000300\n", "2020-01-12 16:06:26,462 Epoch 7 Step: 58800 Batch Loss: 1.258526 Tokens per Sec: 17431, Lr: 0.000300\n", "2020-01-12 16:06:40,873 Epoch 7 Step: 58900 Batch Loss: 1.067553 Tokens per Sec: 17410, Lr: 0.000300\n", "2020-01-12 16:06:55,316 Epoch 7 Step: 59000 Batch Loss: 1.365426 Tokens per Sec: 17763, Lr: 0.000300\n", "2020-01-12 16:07:32,403 Hooray! 
New best validation result [ppl]!\n", "2020-01-12 16:07:32,403 Saving new checkpoint.\n", "2020-01-12 16:07:32,738 Example #0\n", "2020-01-12 16:07:32,739 \tSource: But , naturally , we also want relief .\n", "2020-01-12 16:07:32,739 \tReference: Kambe hi lava leswaku ku xaniseka ku herisiwa .\n", "2020-01-12 16:07:32,739 \tHypothesis: Kambe , hi ntolovelo , hi lava ku phalala .\n", "2020-01-12 16:07:32,739 Example #1\n", "2020-01-12 16:07:32,739 \tSource: 3 “ We’re Letting You Go ”\n", "2020-01-12 16:07:32,739 \tReference: 3 “ Ntirho Wa Wena Wu Herile ”\n", "2020-01-12 16:07:32,741 \tHypothesis: 3 “ Hi Ku Tirhisa ”\n", "2020-01-12 16:07:32,741 Example #2\n", "2020-01-12 16:07:32,741 \tSource: Paralleling our actions against terrorism , we must have equally vigorous efforts to enhance freedom , dignity , justice , and humanitarian values .\n", "2020-01-12 16:07:32,741 \tReference: Loko hi ri karhi hi lwisana ni vutherorisi , hi fanele hi endla matshalatshala lamakulu yo lwela ntshunxeko , xindzhuti , vululami ni ku antswisa timfanelo ta hina .\n", "2020-01-12 16:07:32,741 \tHypothesis: Loko hi pfumelela swiendlo swa hina swo lwisana ni vutherorisi , hi fanele hi tikarhatela ku kuma ntshunxeko , ku xiximiwa , vululami ni mimpimanyeto ya vanhu .\n", "2020-01-12 16:07:32,741 Example #3\n", "2020-01-12 16:07:32,741 \tSource: Baptism symbolizes dedication to God and marks one’s ordination as a minister\n", "2020-01-12 16:07:32,741 \tReference: Ku khuvuriwa ku kombisa ku tinyiketela eka Xikwembu naswona ku fungha ku vekiwa ka munhu tanihi mutirheli\n", "2020-01-12 16:07:32,742 \tHypothesis: Nkhuvulo wu fanekisela ku tinyiketela eka Xikwembu naswona wu fungha ku va mutirheli\n", "2020-01-12 16:07:32,742 Validation result (greedy) at epoch 7, step 59000: bleu: 34.94, loss: 34941.9766, ppl: 3.2649, duration: 37.4252s\n", "2020-01-12 16:07:47,046 Epoch 7 Step: 59100 Batch Loss: 1.340600 Tokens per Sec: 17300, Lr: 0.000300\n", "2020-01-12 16:08:01,437 Epoch 7 Step: 59200 Batch Loss: 1.368596 Tokens per Sec: 17843, Lr: 0.000300\n", "2020-01-12 16:08:15,674 Epoch 7 Step: 59300 Batch Loss: 1.458361 Tokens per Sec: 17615, Lr: 0.000300\n", "2020-01-12 16:08:30,024 Epoch 7 Step: 59400 Batch Loss: 1.194110 Tokens per Sec: 17335, Lr: 0.000300\n", "2020-01-12 16:08:44,372 Epoch 7 Step: 59500 Batch Loss: 1.432519 Tokens per Sec: 17401, Lr: 0.000300\n", "2020-01-12 16:08:58,703 Epoch 7 Step: 59600 Batch Loss: 1.365334 Tokens per Sec: 17268, Lr: 0.000300\n", "2020-01-12 16:09:12,970 Epoch 7 Step: 59700 Batch Loss: 1.374937 Tokens per Sec: 17276, Lr: 0.000300\n", "2020-01-12 16:09:27,294 Epoch 7 Step: 59800 Batch Loss: 1.349800 Tokens per Sec: 17180, Lr: 0.000300\n", "2020-01-12 16:09:41,601 Epoch 7 Step: 59900 Batch Loss: 1.234746 Tokens per Sec: 17488, Lr: 0.000300\n", "2020-01-12 16:09:55,901 Epoch 7 Step: 60000 Batch Loss: 1.236807 Tokens per Sec: 17258, Lr: 0.000300\n", "2020-01-12 16:10:33,021 Hooray! 
New best validation result [ppl]!\n", "2020-01-12 16:10:33,021 Saving new checkpoint.\n", "2020-01-12 16:10:33,310 Example #0\n", "2020-01-12 16:10:33,311 \tSource: But , naturally , we also want relief .\n", "2020-01-12 16:10:33,311 \tReference: Kambe hi lava leswaku ku xaniseka ku herisiwa .\n", "2020-01-12 16:10:33,311 \tHypothesis: Kambe , hi ntolovelo , na hina hi lava ku ntshunxiwa .\n", "2020-01-12 16:10:33,311 Example #1\n", "2020-01-12 16:10:33,311 \tSource: 3 “ We’re Letting You Go ”\n", "2020-01-12 16:10:33,311 \tReference: 3 “ Ntirho Wa Wena Wu Herile ”\n", "2020-01-12 16:10:33,311 \tHypothesis: 3 “ Hi Ku Langutela ”\n", "2020-01-12 16:10:33,311 Example #2\n", "2020-01-12 16:10:33,311 \tSource: Paralleling our actions against terrorism , we must have equally vigorous efforts to enhance freedom , dignity , justice , and humanitarian values .\n", "2020-01-12 16:10:33,312 \tReference: Loko hi ri karhi hi lwisana ni vutherorisi , hi fanele hi endla matshalatshala lamakulu yo lwela ntshunxeko , xindzhuti , vululami ni ku antswisa timfanelo ta hina .\n", "2020-01-12 16:10:33,312 \tHypothesis: Loko hi pfumelela swiendlo swa hina swo lwisana ni vutherorisi , hi fanele hi tikarhatela ku kuma ntshunxeko , vululami , vululami ni mimpimanyeto ya vanhu .\n", "2020-01-12 16:10:33,312 Example #3\n", "2020-01-12 16:10:33,312 \tSource: Baptism symbolizes dedication to God and marks one’s ordination as a minister\n", "2020-01-12 16:10:33,312 \tReference: Ku khuvuriwa ku kombisa ku tinyiketela eka Xikwembu naswona ku fungha ku vekiwa ka munhu tanihi mutirheli\n", "2020-01-12 16:10:33,313 \tHypothesis: Nkhuvulo wu kombisa ku tinyiketela eka Xikwembu naswona wu fungha ku va mutirheli\n", "2020-01-12 16:10:33,313 Validation result (greedy) at epoch 7, step 60000: bleu: 34.70, loss: 34747.3125, ppl: 3.2435, duration: 37.4115s\n", "2020-01-12 16:10:47,571 Epoch 7 Step: 60100 Batch Loss: 1.284963 Tokens per Sec: 17576, Lr: 0.000300\n", "2020-01-12 16:11:01,903 Epoch 7 Step: 60200 Batch Loss: 1.674617 Tokens per Sec: 17436, Lr: 0.000300\n", "2020-01-12 16:11:16,159 Epoch 7 Step: 60300 Batch Loss: 1.328425 Tokens per Sec: 17583, Lr: 0.000300\n", "2020-01-12 16:11:30,496 Epoch 7 Step: 60400 Batch Loss: 1.232200 Tokens per Sec: 17514, Lr: 0.000300\n", "2020-01-12 16:11:44,799 Epoch 7 Step: 60500 Batch Loss: 1.359170 Tokens per Sec: 17793, Lr: 0.000300\n", "2020-01-12 16:11:59,212 Epoch 7 Step: 60600 Batch Loss: 1.303009 Tokens per Sec: 17397, Lr: 0.000300\n", "2020-01-12 16:12:13,488 Epoch 7 Step: 60700 Batch Loss: 1.384751 Tokens per Sec: 17689, Lr: 0.000300\n", "2020-01-12 16:12:27,795 Epoch 7 Step: 60800 Batch Loss: 1.261344 Tokens per Sec: 17238, Lr: 0.000300\n", "2020-01-12 16:12:42,151 Epoch 7 Step: 60900 Batch Loss: 1.341329 Tokens per Sec: 17602, Lr: 0.000300\n", "2020-01-12 16:12:56,504 Epoch 7 Step: 61000 Batch Loss: 1.133342 Tokens per Sec: 17514, Lr: 0.000300\n", "2020-01-12 16:13:33,610 Hooray! 
New best validation result [ppl]!\n", "2020-01-12 16:13:33,610 Saving new checkpoint.\n", "2020-01-12 16:13:33,893 Example #0\n", "2020-01-12 16:13:33,894 \tSource: But , naturally , we also want relief .\n", "2020-01-12 16:13:33,894 \tReference: Kambe hi lava leswaku ku xaniseka ku herisiwa .\n", "2020-01-12 16:13:33,894 \tHypothesis: Kambe , hi ntolovelo , hi lava ku ntshunxeka .\n", "2020-01-12 16:13:33,894 Example #1\n", "2020-01-12 16:13:33,894 \tSource: 3 “ We’re Letting You Go ”\n", "2020-01-12 16:13:33,894 \tReference: 3 “ Ntirho Wa Wena Wu Herile ”\n", "2020-01-12 16:13:33,894 \tHypothesis: 3 “ Hi Ku Langutela ”\n", "2020-01-12 16:13:33,895 Example #2\n", "2020-01-12 16:13:33,895 \tSource: Paralleling our actions against terrorism , we must have equally vigorous efforts to enhance freedom , dignity , justice , and humanitarian values .\n", "2020-01-12 16:13:33,895 \tReference: Loko hi ri karhi hi lwisana ni vutherorisi , hi fanele hi endla matshalatshala lamakulu yo lwela ntshunxeko , xindzhuti , vululami ni ku antswisa timfanelo ta hina .\n", "2020-01-12 16:13:33,895 \tHypothesis: Loko hi tshika swiendlo swa hina swo biha , hi fanele hi va ni matshalatshala lama engetelekeke yo kurisa ntshunxeko , vululami , vululami ni mimpimanyeto ya vanhu .\n", "2020-01-12 16:13:33,895 Example #3\n", "2020-01-12 16:13:33,895 \tSource: Baptism symbolizes dedication to God and marks one’s ordination as a minister\n", "2020-01-12 16:13:33,895 \tReference: Ku khuvuriwa ku kombisa ku tinyiketela eka Xikwembu naswona ku fungha ku vekiwa ka munhu tanihi mutirheli\n", "2020-01-12 16:13:33,896 \tHypothesis: Nkhuvulo wu kombisa ku tinyiketela eka Xikwembu naswona wu fungha ku hlawuriwa ka munhu tanihi mutirheli\n", "2020-01-12 16:13:33,896 Validation result (greedy) at epoch 7, step 61000: bleu: 35.06, loss: 34724.3633, ppl: 3.2409, duration: 37.3915s\n", "2020-01-12 16:13:48,309 Epoch 7 Step: 61100 Batch Loss: 1.578110 Tokens per Sec: 17182, Lr: 0.000300\n", "2020-01-12 16:14:02,631 Epoch 7 Step: 61200 Batch Loss: 1.236426 Tokens per Sec: 17634, Lr: 0.000300\n", "2020-01-12 16:14:16,943 Epoch 7 Step: 61300 Batch Loss: 1.172725 Tokens per Sec: 17652, Lr: 0.000300\n", "2020-01-12 16:14:31,261 Epoch 7 Step: 61400 Batch Loss: 1.234359 Tokens per Sec: 17113, Lr: 0.000300\n", "2020-01-12 16:14:45,554 Epoch 7 Step: 61500 Batch Loss: 1.260785 Tokens per Sec: 17515, Lr: 0.000300\n", "2020-01-12 16:14:59,868 Epoch 7 Step: 61600 Batch Loss: 1.995620 Tokens per Sec: 17382, Lr: 0.000300\n", "2020-01-12 16:15:14,171 Epoch 7 Step: 61700 Batch Loss: 1.263772 Tokens per Sec: 17622, Lr: 0.000300\n", "2020-01-12 16:15:28,441 Epoch 7 Step: 61800 Batch Loss: 1.190562 Tokens per Sec: 17153, Lr: 0.000300\n", "2020-01-12 16:15:42,780 Epoch 7 Step: 61900 Batch Loss: 1.430600 Tokens per Sec: 17676, Lr: 0.000300\n", "2020-01-12 16:15:57,135 Epoch 7 Step: 62000 Batch Loss: 1.444077 Tokens per Sec: 17271, Lr: 0.000300\n", "2020-01-12 16:16:34,320 Hooray! 
New best validation result [ppl]!\n", "2020-01-12 16:16:34,321 Saving new checkpoint.\n", "2020-01-12 16:16:34,616 Example #0\n", "2020-01-12 16:16:34,617 \tSource: But , naturally , we also want relief .\n", "2020-01-12 16:16:34,617 \tReference: Kambe hi lava leswaku ku xaniseka ku herisiwa .\n", "2020-01-12 16:16:34,617 \tHypothesis: Kambe , hi ntolovelo , hi lava ku ntshunxeka .\n", "2020-01-12 16:16:34,617 Example #1\n", "2020-01-12 16:16:34,617 \tSource: 3 “ We’re Letting You Go ”\n", "2020-01-12 16:16:34,617 \tReference: 3 “ Ntirho Wa Wena Wu Herile ”\n", "2020-01-12 16:16:34,617 \tHypothesis: 3 “ Hi Ku Langutela ”\n", "2020-01-12 16:16:34,617 Example #2\n", "2020-01-12 16:16:34,617 \tSource: Paralleling our actions against terrorism , we must have equally vigorous efforts to enhance freedom , dignity , justice , and humanitarian values .\n", "2020-01-12 16:16:34,617 \tReference: Loko hi ri karhi hi lwisana ni vutherorisi , hi fanele hi endla matshalatshala lamakulu yo lwela ntshunxeko , xindzhuti , vululami ni ku antswisa timfanelo ta hina .\n", "2020-01-12 16:16:34,617 \tHypothesis: Loko hi pfumelela swiendlo swa hina swo biha , hi fanele hi tikarhatela ku nyika ntshunxeko , ku xiximeka , vululami ni mahanyelo ya vanhu .\n", "2020-01-12 16:16:34,618 Example #3\n", "2020-01-12 16:16:34,618 \tSource: Baptism symbolizes dedication to God and marks one’s ordination as a minister\n", "2020-01-12 16:16:34,618 \tReference: Ku khuvuriwa ku kombisa ku tinyiketela eka Xikwembu naswona ku fungha ku vekiwa ka munhu tanihi mutirheli\n", "2020-01-12 16:16:34,618 \tHypothesis: Nkhuvulo wu fanekisela ku tinyiketela eka Xikwembu ni ku fungha ku va mutirheli\n", "2020-01-12 16:16:34,618 Validation result (greedy) at epoch 7, step 62000: bleu: 35.65, loss: 34700.9062, ppl: 3.2384, duration: 37.4828s\n", "2020-01-12 16:16:48,949 Epoch 7 Step: 62100 Batch Loss: 1.189203 Tokens per Sec: 17468, Lr: 0.000300\n", "2020-01-12 16:17:03,369 Epoch 7 Step: 62200 Batch Loss: 1.176107 Tokens per Sec: 17499, Lr: 0.000300\n", "2020-01-12 16:17:17,746 Epoch 7 Step: 62300 Batch Loss: 1.206925 Tokens per Sec: 17426, Lr: 0.000300\n", "2020-01-12 16:17:32,121 Epoch 7 Step: 62400 Batch Loss: 1.254429 Tokens per Sec: 17478, Lr: 0.000300\n", "2020-01-12 16:17:46,478 Epoch 7 Step: 62500 Batch Loss: 1.291152 Tokens per Sec: 17555, Lr: 0.000300\n", "2020-01-12 16:18:00,789 Epoch 7 Step: 62600 Batch Loss: 1.181374 Tokens per Sec: 17399, Lr: 0.000300\n", "2020-01-12 16:18:15,115 Epoch 7 Step: 62700 Batch Loss: 1.199990 Tokens per Sec: 17745, Lr: 0.000300\n", "2020-01-12 16:18:26,909 Epoch 7: total training loss 12000.58\n", "2020-01-12 16:18:26,909 EPOCH 8\n", "2020-01-12 16:18:30,704 Epoch 8 Step: 62800 Batch Loss: 1.313637 Tokens per Sec: 10917, Lr: 0.000300\n", "2020-01-12 16:18:45,032 Epoch 8 Step: 62900 Batch Loss: 1.380165 Tokens per Sec: 17524, Lr: 0.000300\n", "2020-01-12 16:18:59,354 Epoch 8 Step: 63000 Batch Loss: 1.588774 Tokens per Sec: 17558, Lr: 0.000300\n", "2020-01-12 16:19:36,574 Hooray! 
New best validation result [ppl]!\n", "2020-01-12 16:19:36,574 Saving new checkpoint.\n", "2020-01-12 16:19:36,884 Example #0\n", "2020-01-12 16:19:36,884 \tSource: But , naturally , we also want relief .\n", "2020-01-12 16:19:36,885 \tReference: Kambe hi lava leswaku ku xaniseka ku herisiwa .\n", "2020-01-12 16:19:36,885 \tHypothesis: Kambe , hi ntumbuluko , hi lava ku ntshunxiwa .\n", "2020-01-12 16:19:36,885 Example #1\n", "2020-01-12 16:19:36,885 \tSource: 3 “ We’re Letting You Go ”\n", "2020-01-12 16:19:36,885 \tReference: 3 “ Ntirho Wa Wena Wu Herile ”\n", "2020-01-12 16:19:36,885 \tHypothesis: 3 “ Hi Ku Langutela ”\n", "2020-01-12 16:19:36,885 Example #2\n", "2020-01-12 16:19:36,885 \tSource: Paralleling our actions against terrorism , we must have equally vigorous efforts to enhance freedom , dignity , justice , and humanitarian values .\n", "2020-01-12 16:19:36,885 \tReference: Loko hi ri karhi hi lwisana ni vutherorisi , hi fanele hi endla matshalatshala lamakulu yo lwela ntshunxeko , xindzhuti , vululami ni ku antswisa timfanelo ta hina .\n", "2020-01-12 16:19:36,885 \tHypothesis: Loko hi tshika swiendlo swa hina swo lwisana ni vutherorisi , hi fanele hi va ni matshalatshala lama ringanaka yo endla ntshunxeko , vululami ni mimpimanyeto ya vanhu .\n", "2020-01-12 16:19:36,886 Example #3\n", "2020-01-12 16:19:36,886 \tSource: Baptism symbolizes dedication to God and marks one’s ordination as a minister\n", "2020-01-12 16:19:36,886 \tReference: Ku khuvuriwa ku kombisa ku tinyiketela eka Xikwembu naswona ku fungha ku vekiwa ka munhu tanihi mutirheli\n", "2020-01-12 16:19:36,886 \tHypothesis: Nkhuvulo wu kombisa ku tinyiketela eka Xikwembu naswona wu fungha ku va mutirheli\n", "2020-01-12 16:19:36,886 Validation result (greedy) at epoch 8, step 63000: bleu: 35.24, loss: 34351.1211, ppl: 3.2002, duration: 37.5319s\n", "2020-01-12 16:19:51,221 Epoch 8 Step: 63100 Batch Loss: 1.275986 Tokens per Sec: 17509, Lr: 0.000300\n", "2020-01-12 16:20:05,587 Epoch 8 Step: 63200 Batch Loss: 1.169914 Tokens per Sec: 17953, Lr: 0.000300\n", "2020-01-12 16:20:19,901 Epoch 8 Step: 63300 Batch Loss: 1.326279 Tokens per Sec: 17430, Lr: 0.000300\n", "2020-01-12 16:20:34,156 Epoch 8 Step: 63400 Batch Loss: 1.195761 Tokens per Sec: 17489, Lr: 0.000300\n", "2020-01-12 16:20:48,467 Epoch 8 Step: 63500 Batch Loss: 1.263841 Tokens per Sec: 17366, Lr: 0.000300\n", "2020-01-12 16:21:02,802 Epoch 8 Step: 63600 Batch Loss: 1.461023 Tokens per Sec: 17675, Lr: 0.000300\n", "2020-01-12 16:21:17,063 Epoch 8 Step: 63700 Batch Loss: 1.188698 Tokens per Sec: 17464, Lr: 0.000300\n", "2020-01-12 16:21:31,395 Epoch 8 Step: 63800 Batch Loss: 1.339535 Tokens per Sec: 17745, Lr: 0.000300\n", "2020-01-12 16:21:45,721 Epoch 8 Step: 63900 Batch Loss: 1.195882 Tokens per Sec: 17466, Lr: 0.000300\n", "2020-01-12 16:22:00,087 Epoch 8 Step: 64000 Batch Loss: 1.321579 Tokens per Sec: 17540, Lr: 0.000300\n", "2020-01-12 16:22:37,294 Example #0\n", "2020-01-12 16:22:37,294 \tSource: But , naturally , we also want relief .\n", "2020-01-12 16:22:37,294 \tReference: Kambe hi lava leswaku ku xaniseka ku herisiwa .\n", "2020-01-12 16:22:37,294 \tHypothesis: Kambe , hi ntolovelo , hi lava ku phasiwa .\n", "2020-01-12 16:22:37,294 Example #1\n", "2020-01-12 16:22:37,294 \tSource: 3 “ We’re Letting You Go ”\n", "2020-01-12 16:22:37,294 \tReference: 3 “ Ntirho Wa Wena Wu Herile ”\n", "2020-01-12 16:22:37,295 \tHypothesis: 3 “ Hi Ku Langutele ”\n", "2020-01-12 16:22:37,295 Example #2\n", "2020-01-12 16:22:37,295 \tSource: Paralleling our actions 
against terrorism , we must have equally vigorous efforts to enhance freedom , dignity , justice , and humanitarian values .\n", "2020-01-12 16:22:37,295 \tReference: Loko hi ri karhi hi lwisana ni vutherorisi , hi fanele hi endla matshalatshala lamakulu yo lwela ntshunxeko , xindzhuti , vululami ni ku antswisa timfanelo ta hina .\n", "2020-01-12 16:22:37,295 \tHypothesis: Loko hi pfumelela swiendlo swa hina swo lwisana ni vutherorisi , hi fanele hi va ni matshalatshala lama ringanaka yo kuma ntshunxeko , ku xiximiwa , vululami ni mimpimanyeto ya vanhu .\n", "2020-01-12 16:22:37,295 Example #3\n", "2020-01-12 16:22:37,296 \tSource: Baptism symbolizes dedication to God and marks one’s ordination as a minister\n", "2020-01-12 16:22:37,296 \tReference: Ku khuvuriwa ku kombisa ku tinyiketela eka Xikwembu naswona ku fungha ku vekiwa ka munhu tanihi mutirheli\n", "2020-01-12 16:22:37,296 \tHypothesis: Nkhuvulo wu kombisa ku tinyiketela eka Xikwembu naswona wu fungha ku vekiwa ka munhu tanihi mutirheli\n", "2020-01-12 16:22:37,296 Validation result (greedy) at epoch 8, step 64000: bleu: 35.16, loss: 34395.4688, ppl: 3.2050, duration: 37.2088s\n", "2020-01-12 16:22:51,662 Epoch 8 Step: 64100 Batch Loss: 1.430489 Tokens per Sec: 17585, Lr: 0.000300\n", "2020-01-12 16:23:05,915 Epoch 8 Step: 64200 Batch Loss: 1.391218 Tokens per Sec: 17055, Lr: 0.000300\n", "2020-01-12 16:23:20,144 Epoch 8 Step: 64300 Batch Loss: 1.338745 Tokens per Sec: 17340, Lr: 0.000300\n", "2020-01-12 16:23:34,531 Epoch 8 Step: 64400 Batch Loss: 1.278465 Tokens per Sec: 17569, Lr: 0.000300\n", "2020-01-12 16:23:49,061 Epoch 8 Step: 64500 Batch Loss: 1.212025 Tokens per Sec: 17517, Lr: 0.000300\n", "Traceback (most recent call last):\n", " File \"/usr/lib/python3.6/runpy.py\", line 193, in _run_module_as_main\n", " \"__main__\", mod_spec)\n", " File \"/usr/lib/python3.6/runpy.py\", line 85, in _run_code\n", " exec(code, run_globals)\n", " File \"/content/joeynmt/joeynmt/__main__.py\", line 41, in <module>\n", " main()\n", " File \"/content/joeynmt/joeynmt/__main__.py\", line 29, in main\n", " train(cfg_file=args.config_path)\n", " File \"/content/joeynmt/joeynmt/training.py\", line 596, in train\n", " trainer.train_and_validate(train_data=train_data, valid_data=dev_data)\n", " File \"/content/joeynmt/joeynmt/training.py\", line 296, in train_and_validate\n", " batch_loss = self._train_batch(batch, update=update)\n", " File \"/content/joeynmt/joeynmt/training.py\", line 436, in _train_batch\n", " batch=batch, loss_function=self.loss)\n", " File \"/content/joeynmt/joeynmt/model.py\", line 133, in get_loss_for_batch\n", " trg_mask=batch.trg_mask)\n", " File \"/content/joeynmt/joeynmt/model.py\", line 80, in forward\n", " trg_mask=trg_mask)\n", " File \"/content/joeynmt/joeynmt/model.py\", line 117, in decode\n", " trg_mask=trg_mask)\n", " File \"/usr/local/lib/python3.6/dist-packages/torch/nn/modules/module.py\", line 541, in __call__\n", " result = self.forward(*input, **kwargs)\n", " File \"/content/joeynmt/joeynmt/decoders.py\", line 521, in forward\n", " output = self.output_layer(x)\n", " File \"/usr/local/lib/python3.6/dist-packages/torch/nn/modules/module.py\", line 541, in __call__\n", " result = self.forward(*input, **kwargs)\n", " File \"/usr/local/lib/python3.6/dist-packages/torch/nn/modules/linear.py\", line 87, in forward\n", " return F.linear(input, self.weight, self.bias)\n", " File \"/usr/local/lib/python3.6/dist-packages/torch/nn/functional.py\", line 1372, in linear\n", " output = input.matmul(weight.t())\n",
"KeyboardInterrupt\n" ], "name": "stdout" } ] }, { "cell_type": "code", "metadata": { "colab_type": "code", "id": "MBoDS09JM807", "outputId": "cd6554e9-975b-4ecd-958a-d401635e1cff", "colab": { "base_uri": "https://localhost:8080/", "height": 54 } }, "source": [ "# Copy the created models from the notebook storage to google drive for persistant storage \n", "!cp -r joeynmt/models/${src}${tgt}_transformer/* \"$gdrive_path/models/${src}${tgt}_transformer/\"" ], "execution_count": 0, "outputs": [ { "output_type": "stream", "text": [ "cp: cannot create symbolic link '/content/drive/My Drive/masakhane/en-ts-jw300-baseline/models/ents_transformer/best.ckpt': Operation not supported\n" ], "name": "stdout" } ] }, { "cell_type": "code", "metadata": { "colab_type": "code", "id": "n94wlrCjVc17", "outputId": "32ee0396-2f54-483c-8ff0-ea604dd4534a", "colab": { "base_uri": "https://localhost:8080/", "height": 1000 } }, "source": [ "# Output our validation accuracy\n", "! cat \"$gdrive_path/models/${src}${tgt}_transformer/validations.txt\"" ], "execution_count": 0, "outputs": [ { "output_type": "stream", "text": [ "Steps: 1000\tLoss: 115859.95312\tPPL: 50.56873\tbleu: 0.28609\tLR: 0.00030000\t*\n", "Steps: 2000\tLoss: 93454.92188\tPPL: 23.68017\tbleu: 1.25480\tLR: 0.00030000\t*\n", "Steps: 3000\tLoss: 82517.15625\tPPL: 16.35045\tbleu: 3.86151\tLR: 0.00030000\t*\n", "Steps: 4000\tLoss: 74959.91406\tPPL: 12.65873\tbleu: 7.51147\tLR: 0.00030000\t*\n", "Steps: 5000\tLoss: 69489.76562\tPPL: 10.51826\tbleu: 10.79874\tLR: 0.00030000\t*\n", "Steps: 6000\tLoss: 65908.36719\tPPL: 9.31696\tbleu: 12.64414\tLR: 0.00030000\t*\n", "Steps: 7000\tLoss: 62382.48438\tPPL: 8.26840\tbleu: 14.44985\tLR: 0.00030000\t*\n", "Steps: 8000\tLoss: 59565.63672\tPPL: 7.51615\tbleu: 16.71410\tLR: 0.00030000\t*\n", "Steps: 9000\tLoss: 56864.46094\tPPL: 6.85916\tbleu: 19.01188\tLR: 0.00030000\t*\n", "Steps: 10000\tLoss: 54950.11719\tPPL: 6.42862\tbleu: 19.83296\tLR: 0.00030000\t*\n", "Steps: 11000\tLoss: 54042.69531\tPPL: 6.23409\tbleu: 20.71138\tLR: 0.00030000\t*\n", "Steps: 12000\tLoss: 51465.35938\tPPL: 5.71307\tbleu: 22.63582\tLR: 0.00030000\t*\n", "Steps: 13000\tLoss: 50235.17969\tPPL: 5.47997\tbleu: 23.31744\tLR: 0.00030000\t*\n", "Steps: 14000\tLoss: 48921.69531\tPPL: 5.24157\tbleu: 24.47259\tLR: 0.00030000\t*\n", "Steps: 15000\tLoss: 48019.04688\tPPL: 5.08378\tbleu: 24.90419\tLR: 0.00030000\t*\n", "Steps: 16000\tLoss: 47117.40234\tPPL: 4.93091\tbleu: 25.99035\tLR: 0.00030000\t*\n", "Steps: 17000\tLoss: 46240.28906\tPPL: 4.78661\tbleu: 26.51280\tLR: 0.00030000\t*\n", "Steps: 18000\tLoss: 45468.56641\tPPL: 4.66314\tbleu: 26.79090\tLR: 0.00030000\t*\n", "Steps: 19000\tLoss: 45205.83203\tPPL: 4.62184\tbleu: 27.01206\tLR: 0.00030000\t*\n", "Steps: 20000\tLoss: 44536.96875\tPPL: 4.51833\tbleu: 27.39605\tLR: 0.00030000\t*\n", "Steps: 21000\tLoss: 43765.89844\tPPL: 4.40188\tbleu: 28.46029\tLR: 0.00030000\t*\n", "Steps: 22000\tLoss: 43220.50781\tPPL: 4.32133\tbleu: 28.55713\tLR: 0.00030000\t*\n", "Steps: 23000\tLoss: 42786.58594\tPPL: 4.25830\tbleu: 29.40799\tLR: 0.00030000\t*\n", "Steps: 24000\tLoss: 42366.45312\tPPL: 4.19815\tbleu: 29.07835\tLR: 0.00030000\t*\n", "Steps: 25000\tLoss: 41729.58203\tPPL: 4.10858\tbleu: 29.84291\tLR: 0.00030000\t*\n", "Steps: 26000\tLoss: 41416.13672\tPPL: 4.06520\tbleu: 29.98972\tLR: 0.00030000\t*\n", "Steps: 27000\tLoss: 41221.91797\tPPL: 4.03855\tbleu: 30.07309\tLR: 0.00030000\t*\n", "Steps: 28000\tLoss: 41374.14062\tPPL: 4.05942\tbleu: 29.75493\tLR: 0.00030000\t\n", "Steps: 29000\tLoss: 40627.22266\tPPL: 
3.95804\tbleu: 30.57743\tLR: 0.00030000\t*\n", "Steps: 30000\tLoss: 40022.15234\tPPL: 3.87776\tbleu: 30.56523\tLR: 0.00030000\t*\n", "Steps: 31000\tLoss: 39652.46875\tPPL: 3.82952\tbleu: 31.36031\tLR: 0.00030000\t*\n", "Steps: 32000\tLoss: 39589.60547\tPPL: 3.82138\tbleu: 31.37947\tLR: 0.00030000\t*\n", "Steps: 33000\tLoss: 39230.70312\tPPL: 3.77522\tbleu: 31.40688\tLR: 0.00030000\t*\n", "Steps: 34000\tLoss: 38992.87109\tPPL: 3.74494\tbleu: 31.84697\tLR: 0.00030000\t*\n", "Steps: 35000\tLoss: 38774.51562\tPPL: 3.71735\tbleu: 32.41971\tLR: 0.00030000\t*\n", "Steps: 36000\tLoss: 38793.93750\tPPL: 3.71979\tbleu: 32.22839\tLR: 0.00030000\t\n", "Steps: 37000\tLoss: 38262.83984\tPPL: 3.65349\tbleu: 32.07455\tLR: 0.00030000\t*\n", "Steps: 38000\tLoss: 37997.42578\tPPL: 3.62080\tbleu: 32.46436\tLR: 0.00030000\t*\n", "Steps: 39000\tLoss: 37956.53516\tPPL: 3.61579\tbleu: 32.73536\tLR: 0.00030000\t*\n", "Steps: 40000\tLoss: 37645.30469\tPPL: 3.57789\tbleu: 32.51147\tLR: 0.00030000\t*\n", "Steps: 41000\tLoss: 37480.39453\tPPL: 3.55796\tbleu: 33.16825\tLR: 0.00030000\t*\n", "Steps: 42000\tLoss: 37400.12109\tPPL: 3.54830\tbleu: 33.39540\tLR: 0.00030000\t*\n", "Steps: 43000\tLoss: 37119.00000\tPPL: 3.51469\tbleu: 33.06072\tLR: 0.00030000\t*\n", "Steps: 44000\tLoss: 37003.76172\tPPL: 3.50100\tbleu: 33.25878\tLR: 0.00030000\t*\n", "Steps: 45000\tLoss: 36675.69922\tPPL: 3.46232\tbleu: 33.86254\tLR: 0.00030000\t*\n", "Steps: 46000\tLoss: 36625.67188\tPPL: 3.45646\tbleu: 33.80602\tLR: 0.00030000\t*\n", "Steps: 47000\tLoss: 36611.36719\tPPL: 3.45479\tbleu: 33.41057\tLR: 0.00030000\t*\n", "Steps: 48000\tLoss: 36390.11719\tPPL: 3.42900\tbleu: 34.23604\tLR: 0.00030000\t*\n", "Steps: 49000\tLoss: 36214.18359\tPPL: 3.40863\tbleu: 33.35375\tLR: 0.00030000\t*\n", "Steps: 50000\tLoss: 35913.88281\tPPL: 3.37414\tbleu: 33.79505\tLR: 0.00030000\t*\n", "Steps: 51000\tLoss: 35825.21875\tPPL: 3.36403\tbleu: 34.02975\tLR: 0.00030000\t*\n", "Steps: 52000\tLoss: 35829.19531\tPPL: 3.36448\tbleu: 33.76530\tLR: 0.00030000\t\n", "Steps: 53000\tLoss: 35491.43359\tPPL: 3.32622\tbleu: 33.85946\tLR: 0.00030000\t*\n", "Steps: 54000\tLoss: 35431.94141\tPPL: 3.31953\tbleu: 34.58088\tLR: 0.00030000\t*\n", "Steps: 55000\tLoss: 35645.22266\tPPL: 3.34359\tbleu: 34.30900\tLR: 0.00030000\t\n", "Steps: 56000\tLoss: 35330.90625\tPPL: 3.30819\tbleu: 34.35627\tLR: 0.00030000\t*\n", "Steps: 57000\tLoss: 35191.98438\tPPL: 3.29266\tbleu: 34.32982\tLR: 0.00030000\t*\n", "Steps: 58000\tLoss: 34966.10547\tPPL: 3.26757\tbleu: 35.06336\tLR: 0.00030000\t*\n", "Steps: 59000\tLoss: 34941.97656\tPPL: 3.26490\tbleu: 34.94208\tLR: 0.00030000\t*\n", "Steps: 60000\tLoss: 34747.31250\tPPL: 3.24345\tbleu: 34.69774\tLR: 0.00030000\t*\n", "Steps: 61000\tLoss: 34724.36328\tPPL: 3.24093\tbleu: 35.06140\tLR: 0.00030000\t*\n", "Steps: 62000\tLoss: 34700.90625\tPPL: 3.23836\tbleu: 35.64704\tLR: 0.00030000\t*\n", "Steps: 63000\tLoss: 34351.12109\tPPL: 3.20023\tbleu: 35.24304\tLR: 0.00030000\t*\n", "Steps: 64000\tLoss: 34395.46875\tPPL: 3.20504\tbleu: 35.16225\tLR: 0.00030000\t\n" ], "name": "stdout" } ] },
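{ "cell_type": "markdown", "metadata": { "colab_type": "text" }, "source": [ "A note on the copy-to-Drive cell further up: it reported that `best.ckpt` could not be copied. JoeyNMT saves `best.ckpt` as a symbolic link to the best-scoring checkpoint, and Google Drive cannot store symlinks. A minimal workaround (assuming GNU `cp`, which Colab provides) is to repeat the copy with `-L` / `--dereference`, so the checkpoint file behind the link is copied instead of the link itself.\n", "\n", "It is also handy to see how the dev BLEU in `validations.txt` developed over training. The plotting cell below is only a convenience sketch: it assumes the tab-separated `Key: value` row format shown above (a trailing `*` marks a new best checkpoint), reuses the `src`, `tgt` and `gdrive_path` environment variables set earlier, and uses matplotlib, which Colab ships by default." ] }, { "cell_type": "code", "metadata": { "colab_type": "code", "colab": {} }, "source": [ "# Workaround sketch: best.ckpt is a symlink, which Google Drive does not support.\n", "# GNU cp's -L (--dereference) copies the file the link points to instead.\n", "!cp -rL joeynmt/models/${src}${tgt}_transformer/* \"$gdrive_path/models/${src}${tgt}_transformer/\"" ], "execution_count": 0, "outputs": [] }, { "cell_type": "code", "metadata": { "colab_type": "code", "colab": {} }, "source": [ "# Sketch: parse validations.txt and plot dev BLEU against training steps.\n", "import os\n", "import matplotlib.pyplot as plt\n", "\n", "val_path = os.path.join(os.environ[\"gdrive_path\"], \"models\",\n", "                        os.environ[\"src\"] + os.environ[\"tgt\"] + \"_transformer\", \"validations.txt\")\n", "steps, bleus = [], []\n", "with open(val_path) as f:\n", "    for line in f:\n", "        # Each row looks like: Steps: 1000<TAB>Loss: ...<TAB>bleu: ...<TAB>*\n", "        fields = dict(p.split(\": \", 1) for p in line.strip().split(\"\\t\") if \": \" in p)\n", "        steps.append(int(fields[\"Steps\"]))\n", "        bleus.append(float(fields[\"bleu\"]))\n", "\n", "plt.plot(steps, bleus)\n", "plt.xlabel(\"Training steps\")\n", "plt.ylabel(\"Dev BLEU\")\n", "plt.show()" ], "execution_count": 0, "outputs": [] }, { "cell_type": "code", "metadata": { "colab_type": "code", "id": "66WhRE9lIhoD", "outputId": "0a7b10ca-3824-4917-90c2-8d179722d706", "colab": { "base_uri": "https://localhost:8080/", "height": 67 } }, "source": [ "# Test our model\n", "! cd joeynmt; python3 -m joeynmt test \"$gdrive_path/models/${src}${tgt}_transformer/config.yaml\"" ], "execution_count": 0, "outputs": [ { "output_type": "stream", "text": [ "2020-01-12 16:24:28,116 Hello! 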
This is Joey-NMT.\n", "2020-01-12 16:25:16,568 dev bleu: 35.07 [Beam search decoding with beam size = 5 and alpha = 1.0]\n", "2020-01-12 16:26:05,704 test bleu: 44.15 [Beam search decoding with beam size = 5 and alpha = 1.0]\n" ], "name": "stdout" } ] },
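{ "cell_type": "markdown", "metadata": { "colab_type": "text" }, "source": [ "Finally, you can try the model on your own sentences. JoeyNMT also offers a `translate` mode that reads source sentences from stdin and prints the model's translations. The cell below is only a rough sketch: the model was trained on tokenized, BPE-segmented text, so raw input should be pre-processed with the same subword model as the training data for the output to be sensible." ] }, { "cell_type": "code", "metadata": { "colab_type": "code", "colab": {} }, "source": [ "# Sketch: translate one sentence from stdin with the trained model.\n", "# NB: for meaningful output, pre-process the input (tokenization + BPE)\n", "# exactly as was done for the training data.\n", "! cd joeynmt; echo \"But we also want relief .\" | python3 -m joeynmt translate \"$gdrive_path/models/${src}${tgt}_transformer/config.yaml\"" ], "execution_count": 0, "outputs": [] } ] }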