Tokenizer class BitnetTokenizer does not exist or is not currently imported.

#1
by qmsoqm - opened

I'm getting this error when I try to run the model.

My code:

```python
# Import necessary libraries
from transformers import AutoTokenizer, AutoModelWithLMHead, pipeline

# Specify the model name
model_name = "1bitLLM/bitnet_b1_58-large"

# Load the tokenizer and the model
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelWithLMHead.from_pretrained(model_name)

# Create the pipeline
pipe = pipeline("text-generation", model=model, tokenizer=tokenizer)

# Define the prompt
prompt = """
The weather in South Korea is like
"""

# Generate the text
print(pipe(prompt))
```

I'm using transformers 4.39.3.


I've found this GitHub issue, which seems relevant.

I changed the tokenizer_class in tokenizer_config.json, as suggested there, and that resolved the original error.
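
For anyone else hitting the first error, here is a minimal sketch of an alternative workaround that avoids editing the config file. It assumes the checkpoint ships a standard LLaMA/SentencePiece vocabulary, which is what the tokenizer_class change suggests; `LlamaTokenizer` needs the `sentencepiece` package installed.

```python
from transformers import LlamaTokenizer, AutoModelForCausalLM, pipeline

model_name = "1bitLLM/bitnet_b1_58-large"

# Load a LLaMA tokenizer directly instead of letting AutoTokenizer look up
# the (nonexistent) BitnetTokenizer class named in tokenizer_config.json.
tokenizer = LlamaTokenizer.from_pretrained(model_name)

# AutoModelForCausalLM is the non-deprecated replacement for AutoModelWithLMHead.
model = AutoModelForCausalLM.from_pretrained(model_name)

pipe = pipeline("text-generation", model=model, tokenizer=tokenizer)
print(pipe("The weather in South Korea is like", max_new_tokens=50))
```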

However, I then ran into a second error:

ValueError: Cannot use chat template functions because tokenizer.chat_template is not set and no template argument was passed! For information about writing templates and setting the tokenizer.chat_template attribute, please see the documentation at https://huggingface.co/docs/transformers/main/en/chat_templating

Has anyone found a solution to this one?
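
Regarding the chat_template error: a plain string prompt to the text-generation pipeline should not need a chat template, so the error presumably comes from a chat-style call somewhere. If so, one workaround suggested by the error message itself is to set tokenizer.chat_template manually before using the chat APIs. The template below is only an illustrative placeholder, not anything defined by this model:

```python
# Purely illustrative template: prefixes each message with its role.
# A real chat model normally expects its own specific formatting and special tokens.
tokenizer.chat_template = (
    "{% for message in messages %}"
    "{{ message['role'] }}: {{ message['content'] }}\n"
    "{% endfor %}"
)

messages = [{"role": "user", "content": "The weather in South Korea is like"}]
prompt = tokenizer.apply_chat_template(messages, tokenize=False)
print(pipe(prompt, max_new_tokens=50))
```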
