bert_finetuning_test1227_hug/tokenizer_config.json
{"do_lower_case": false, "max_len": 512, "init_inputs": []}