---
language:
  - fr
size_categories:
  - 100K<n<1M
task_categories:
  - token-classification
pretty_name: wikiner_fr
dataset_info:
  features:
    - name: id
      dtype: int64
    - name: tokens
      sequence: string
    - name: ner_tags
      sequence:
        class_label:
          names:
            '0': O
            '1': LOC
            '2': PER
            '3': MISC
            '4': ORG
  splits:
    - name: train
      num_bytes: 54139057
      num_examples: 120060
    - name: test
      num_bytes: 5952227
      num_examples: 13393
  download_size: 15572314
  dataset_size: 60091284
configs:
  - config_name: default
    data_files:
      - split: train
        path: data/train-*
      - split: test
        path: data/test-*
---

# Dataset Card for "wikiner_fr_mixed_caps"

This is an updated version of the dataset Jean-Baptiste/wikiner_fr with:

- removal of duplicated examples and leakage
- random de-capitalization of 20% of the words
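The de-capitalization step could look like the sketch below. This is a hypothetical re-implementation for illustration only; the actual logic lives in `update_dataset.py` in this repository and may differ in detail.

```python
import random

def randomly_decapitalize(tokens, p=0.2, rng=None):
    """Lower-case the first letter of roughly a fraction p of the tokens.

    Hypothetical sketch of the 20% random de-capitalization; the real
    implementation is in update_dataset.py.
    """
    rng = rng or random.Random()
    out = []
    for tok in tokens:
        if tok[:1].isupper() and rng.random() < p:
            out.append(tok[0].lower() + tok[1:])
        else:
            out.append(tok)
    return out

tokens = ["Victor", "Hugo", "est", "né", "à", "Besançon"]
print(randomly_decapitalize(tokens, rng=random.Random(0)))
```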

The code used to make these changes is in the script update_dataset.py in this repository.
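Each example's `ner_tags` holds the integer class ids declared in the metadata above. Mapping them back to label strings can be sketched as follows; the example sentence and its tags are made up for illustration, not taken from the data:

```python
# Label ids as declared in the card metadata (ner_tags -> class_label names).
NER_LABELS = ["O", "LOC", "PER", "MISC", "ORG"]

def decode_tags(tag_ids):
    """Map integer ner_tags back to their string labels."""
    return [NER_LABELS[i] for i in tag_ids]

# Hand-made example, not an actual row of the dataset.
tokens = ["Victor", "Hugo", "est", "né", "à", "Besançon", "."]
tag_ids = [2, 2, 0, 0, 0, 1, 0]
print(list(zip(tokens, decode_tags(tag_ids))))
# → [('Victor', 'PER'), ('Hugo', 'PER'), ('est', 'O'), ('né', 'O'),
#    ('à', 'O'), ('Besançon', 'LOC'), ('.', 'O')]
```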

## Dataset Description (reproduced from the original repo)