Some of the data is missing, corrupted, or in a weird format that causes issues with some parsers

#2
by nonetrix - opened

When trying to train a LoRA model with https://github.com/oobabooga/text-generation-webui, I wasn't able to get very far without removing many lines of JSON that couldn't be parsed or were completely empty. I saw a lot of lines with broken output fields that caused errors when the parser tried to handle them. Here is what some of those lines looked like:

"{'instruction': '入力されたワードを説明してください。', 'input': '馬渕川', 'output':'     '
"{'instruction': '入力されたワードを説明してください。', 'input': '馬渕川', 'output': '\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000'}"

Here are some of the errors as well:

Traceback (most recent call last):
  File "/home/noah/Documents/AI/text-generation-webui/installer_files/env/lib/python3.11/site-packages/gradio/queueing.py", line 407, in call_prediction
    output = await route_utils.call_process_api(
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/noah/Documents/AI/text-generation-webui/installer_files/env/lib/python3.11/site-packages/gradio/route_utils.py", line 226, in call_process_api
    output = await app.get_blocks().process_api(
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/noah/Documents/AI/text-generation-webui/installer_files/env/lib/python3.11/site-packages/gradio/blocks.py", line 1550, in process_api
    result = await self.call_function(
             ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/noah/Documents/AI/text-generation-webui/installer_files/env/lib/python3.11/site-packages/gradio/blocks.py", line 1199, in call_function
    prediction = await utils.async_iteration(iterator)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/noah/Documents/AI/text-generation-webui/installer_files/env/lib/python3.11/site-packages/gradio/utils.py", line 519, in async_iteration
    return await iterator.__anext__()
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/noah/Documents/AI/text-generation-webui/installer_files/env/lib/python3.11/site-packages/gradio/utils.py", line 512, in __anext__
    return await anyio.to_thread.run_sync(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/noah/Documents/AI/text-generation-webui/installer_files/env/lib/python3.11/site-packages/anyio/to_thread.py", line 56, in run_sync
    return await get_async_backend().run_sync_in_worker_thread(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/noah/Documents/AI/text-generation-webui/installer_files/env/lib/python3.11/site-packages/anyio/_backends/_asyncio.py", line 2134, in run_sync_in_worker_thread
    return await future
           ^^^^^^^^^^^^
  File "/home/noah/Documents/AI/text-generation-webui/installer_files/env/lib/python3.11/site-packages/anyio/_backends/_asyncio.py", line 851, in run
    result = context.run(func, *args)
             ^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/noah/Documents/AI/text-generation-webui/installer_files/env/lib/python3.11/site-packages/gradio/utils.py", line 495, in run_sync_iterator_async
    return next(iterator)
           ^^^^^^^^^^^^^^
  File "/home/noah/Documents/AI/text-generation-webui/installer_files/env/lib/python3.11/site-packages/gradio/utils.py", line 649, in gen_wrapper
    yield from f(*args, **kwargs)
  File "/home/noah/Documents/AI/text-generation-webui/modules/training.py", line 491, in do_train
    train_data = data['train'].map(generate_and_tokenize_prompt, new_fingerprint='%030x' % random.randrange(16**30))
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/noah/Documents/AI/text-generation-webui/installer_files/env/lib/python3.11/site-packages/datasets/arrow_dataset.py", line 592, in wrapper
    out: Union["Dataset", "DatasetDict"] = func(self, *args, **kwargs)
                                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/noah/Documents/AI/text-generation-webui/installer_files/env/lib/python3.11/site-packages/datasets/arrow_dataset.py", line 557, in wrapper
    out: Union["Dataset", "DatasetDict"] = func(self, *args, **kwargs)
                                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/noah/Documents/AI/text-generation-webui/installer_files/env/lib/python3.11/site-packages/datasets/arrow_dataset.py", line 3093, in map
    for rank, done, content in Dataset._map_single(**dataset_kwargs):
  File "/home/noah/Documents/AI/text-generation-webui/installer_files/env/lib/python3.11/site-packages/datasets/arrow_dataset.py", line 3446, in _map_single
    example = apply_function_on_filtered_inputs(example, i, offset=offset)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/noah/Documents/AI/text-generation-webui/installer_files/env/lib/python3.11/site-packages/datasets/arrow_dataset.py", line 3349, in apply_function_on_filtered_inputs
    processed_inputs = function(*fn_args, *additional_args, **fn_kwargs)
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/noah/Documents/AI/text-generation-webui/modules/training.py", line 486, in generate_and_tokenize_prompt
    prompt = generate_prompt(data_point)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/noah/Documents/AI/text-generation-webui/modules/training.py", line 483, in generate_prompt
    raise RuntimeError(f'Data-point "{data_point}" has no keyset match within format "{list(format_data.keys())}"')
RuntimeError: Data-point "{'instruction': '入力されたワードを説明してください。', 'input': '馬渕川', 'output': '\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000\u3000'}" has no keyset match within format "['instruction,output', 'instruction,input,output']"
RuntimeError: Data-point "{'instruction': '入力されたワードを説明してください。', 'input': '岡倉葉子', 'output': '\u3000\u3000\u3000'}" has no keyset match within format "['instruction,output', 'instruction,input,output']"
RuntimeError: Data-point "{'instruction': '入力されたワードを説明してください。', 'input': '曇りのち、快晴', 'output': ' \n \n '}" has no keyset match within format "['instruction,output', 'instruction,input,output']"

As you can see, all of them share the same instruction, "入力されたワードを説明してください。" ("Please explain the input word."), but there could be more instances later on, as I have only been able to parse 76% of the JSON file. I imagine these examples come from the same source, and that source has a lot of broken data for whatever reason. I might be able to bypass this by using another training backend, but even if these lines were parsed correctly, the broken outputs would very likely reduce model quality, since some of them are just blank spaces or random newlines. This dataset is a great start, but I think it needs a lot more filtering and refining as is.
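
In the meantime, a possible workaround is to drop the offending samples before handing the file to the training backend. Here is a minimal sketch using the datasets library; the file names are placeholders, not the actual dataset files, and the whitespace check is the same str.strip() trick as above.

from datasets import load_dataset

# Placeholder paths -- replace with the real input and output files.
data = load_dataset("json", data_files="llm_japanese_dataset.json", split="train")

# Keep only samples whose output contains something other than whitespace;
# str.strip() also strips the full-width U+3000 spaces seen in the errors.
clean = data.filter(lambda ex: bool(str(ex.get("output") or "").strip()))

print(f"kept {len(clean)} of {len(data)} samples")
clean.to_json("llm_japanese_dataset.clean.json", force_ascii=False)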

Izumi Lab. org

Thank you for reporting the issue.
We have identified that the problem you mentioned occurs when the source is Wikipedia and the article is a redirect page, resulting in the generation of outputs with only blank spaces.
Therefore, we will adjust the filtering process to remove samples with blank outputs.

Izumi Lab. org

We fixed the issue in v1.0.2.

Will close if I don't run into any more issues when I try again. Thanks!

masanorihirano changed discussion status to closed
