
Loading model from Transformers

#9 by asyaasinkson

Hello! I can't load the model with the Transformers library,
although the site says I can access it like this:

# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("Unbabel/wmt22-cometkiwi-da")
It doesn't work for me and I get an error:

File "/usr/local/lib/python3.11/site-packages/transformers/models/auto/configuration_auto.py", line 1038, in from_pretrained
    raise ValueError(
ValueError: Unrecognized model in Unbabel/wmt22-comet-da. Should have a model_type key in its config.json, or contain one of the following strings in its name: albert, align, altclip, audio-spectrogram-transformer, autoformer, bark, bart, beit, bert ...
What can I do?

Unbabel org

Hey! You need to install unbabel-comet and load the model like it's shown in the README. This checkpoint is a COMET model rather than a plain Transformers model, which is why its config.json has no model_type key and AutoModel fails.
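
For reference, a minimal sketch of the README-style loading with the comet package. The example sentences are just placeholders, and since CometKiwi is a reference-free quality-estimation model, each sample only needs a source and a translation; note the checkpoint is gated on the Hub, so you may need to be logged in first.

# pip install unbabel-comet
from comet import download_model, load_from_checkpoint

# Download the checkpoint from the Hugging Face Hub, then load it
model_path = download_model("Unbabel/wmt22-cometkiwi-da")
model = load_from_checkpoint(model_path)

# Score source/translation pairs (no reference needed for CometKiwi)
data = [{"src": "Olá, mundo!", "mt": "Hello, world!"}]
output = model.predict(data, batch_size=8, gpus=0)
print(output.scores)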

Hello! Thank you for the answer. The problem is that I want to store the model in a specific folder inside a Docker container, which is why I wanted to use AutoModel.from_pretrained("Unbabel/wmt22-cometkiwi-da", cache_dir="app/cache"). Is there any way to do that?

Unbabel org

If you download the model to a specific folder, you can always read the checkpoint from there with the load_from_checkpoint function.
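
In case it helps, a minimal sketch of that workflow, assuming your unbabel-comet version supports the saving_directory argument of download_model (the /app/cache path is just an example):

from comet import download_model, load_from_checkpoint

# Download once into a fixed folder (e.g. baked into the Docker image)
checkpoint_path = download_model(
    "Unbabel/wmt22-cometkiwi-da",
    saving_directory="/app/cache",
)

# Later (e.g. at container startup), load directly from that folder
model = load_from_checkpoint(checkpoint_path)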
