Add chat_template to tokenizer_config.json
#39 · opened by irenedea
Manually tested with
from transformers import AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained('mpt-7b-chat-tokenizer')
chat = [
{"role": "system", "content": "This is a prompt!"},
{"role": "user", "content": "Hello, how are you?"},
{"role": "assistant", "content": "I'm doing great. How can I help you today?"},
{"role": "user", "content": "I'd like to show off how chat templating works!"},
]
print(tokenizer.apply_chat_template(chat, tokenize=False))
where mpt-7b-chat-tokenizer is a local folder that includes the modified tokenizer_config.json.
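For reference, a minimal sketch of how such a local folder could be produced: load the published tokenizer, set a chat_template on it, and save it. The ChatML-style template string below is only illustrative (an assumption), not necessarily the exact value this PR adds to tokenizer_config.json.

from transformers import AutoTokenizer

# Start from the published tokenizer and attach a chat template.
tokenizer = AutoTokenizer.from_pretrained('mosaicml/mpt-7b-chat')

# Illustrative ChatML-style template (assumption; the real template is
# whatever the modified tokenizer_config.json in this PR contains).
tokenizer.chat_template = (
    "{% for message in messages %}"
    "{{ '<|im_start|>' + message['role'] + '\n' + message['content'] + '<|im_end|>' + '\n' }}"
    "{% endfor %}"
)

# save_pretrained writes tokenizer_config.json (including chat_template)
# into the local folder used by the manual test above.
tokenizer.save_pretrained('mpt-7b-chat-tokenizer')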
irenedea changed pull request status to closed
Ah yeah, this PR is closed; the JSON is missing a comma. Can you try the manual test described in https://huggingface.co/mosaicml/mpt-7b-chat/discussions/40?
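As a quick sanity check before retrying, the file can be parsed with the standard json module, which pinpoints syntax problems such as a missing comma. The folder path below is assumed to be the same local mpt-7b-chat-tokenizer folder used in the manual test.

import json

# json.load raises a JSONDecodeError with the line/column of any syntax error,
# e.g. a missing comma in the edited tokenizer_config.json.
with open('mpt-7b-chat-tokenizer/tokenizer_config.json') as f:
    config = json.load(f)

# Confirm the new key made it into the config.
print('chat_template' in config)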