How to set trust_remote_code to True?
ModelError: An error occurred (ModelError) when calling the InvokeEndpoint operation: Received client error (400) from primary with message "{
  "code": 400,
  "type": "InternalServerException",
  "message": "Loading /.sagemaker/mms/models/tiiuae__falcon-40b requires you to execute the configuration file in that repo on your local machine. Make sure you have read the code there to avoid malicious use, then set the option trust_remote_code=True to remove this error."
}
Locally, passing trust_remote_code=True to the pipeline works:

import transformers

# trust_remote_code=True lets transformers execute the custom modelling
# code shipped inside the tiiuae/falcon-40b repo.
model = "tiiuae/falcon-40b"
tokenizer = transformers.AutoTokenizer.from_pretrained(model)
qa = transformers.pipeline(
    "text-generation",
    model=model,
    tokenizer=tokenizer,
    torch_dtype="auto",
    trust_remote_code=True,
    device_map="auto",
)
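For reference, invoking the pipeline afterwards looks like this (the prompt and generation settings are illustrative, not from the original post):

# text-generation pipelines return a list of dicts with a "generated_text" key.
sequences = qa("Write a short poem about falcons:", max_new_tokens=50, do_sample=True)
print(sequences[0]["generated_text"])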
What if we run it through AWS SageMaker? This is how I deploy the model to SageMaker Inference:

predictor = huggingface_model.deploy(
    initial_instance_count=1,       # number of instances
    instance_type='ml.m5.xlarge',   # EC2 instance type
    trust_remote_code=True,
)
@mvoisin Setting trust_remote_code=True still gave me the same error.
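One thing worth trying instead: deploy() has no trust_remote_code parameter, so the flag never reaches the serving container that way. Below is a minimal sketch that passes it through the container environment, assuming your inference toolkit version reads HF_TRUST_REMOTE_CODE (that variable name, plus the model_data, role, and instance values, are placeholders to verify, not a confirmed fix):

from sagemaker.huggingface import HuggingFaceModel

huggingface_model = HuggingFaceModel(
    model_data="s3://my-bucket/model.tar.gz",  # placeholder model artifact
    role=role,                                 # your SageMaker execution role
    transformers_version="4.26",
    pytorch_version="1.13",
    py_version="py39",
    # Assumption: your inference toolkit version honors this variable;
    # check the container docs before relying on it.
    env={"HF_TRUST_REMOTE_CODE": "True"},
)
predictor = huggingface_model.deploy(
    initial_instance_count=1,
    instance_type="ml.g5.2xlarge",
)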
This worked for me:
from transformers import AutoTokenizer, AutoModelForCausalLM

# trust_remote_code=True is only needed for the model load, which is where
# the custom Falcon code lives; the tokenizer loads without it.
model = AutoModelForCausalLM.from_pretrained("tiiuae/falcon-40b", trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained("tiiuae/falcon-40b")
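A quick smoke test of the load (the prompt is illustrative; the 40B checkpoint needs substantial GPU memory, so consider device_map="auto" or a smaller Falcon variant when experimenting):

inputs = tokenizer("The falcon is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))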
I failed with all possible combinations to overcome this issue, so I ended up patching the code that raises this exception. For those using a SageMaker Jupyter notebook:
!sed -i 's/if not trust_remote_code:/if False: # Manually replace in-line to avoid #9/g' ~/anaconda3/envs/amazonei_pytorch_latest_p37/lib/python3.7/site-packages/transformers/models/auto/configuration_auto.py
You can now confirm it has been changed with:
!cat ~/anaconda3/envs/amazonei_pytorch_latest_p37/lib/python3.7/site-packages/transformers/models/auto/configuration_auto.py
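Or, to avoid dumping the whole file, grep for the patched line instead (same path as above):

!grep -n "if False: # Manually replace" ~/anaconda3/envs/amazonei_pytorch_latest_p37/lib/python3.7/site-packages/transformers/models/auto/configuration_auto.py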
Facing the same issue. I am following this tutorial https://huggingface.co/blog/sagemaker-huggingface-llm to deploy Falcon 7B on AWS via SageMaker, getting the same error, and don't know where to set trust_remote_code. Please help.
Same here!!
Same for me too - the tutorial makes no mention of the trust setting for Falcon, and I haven't figured it out yet either.
Hey everyone, I've been trying with https://gist.github.com/timesler/4b244a6b73d6e02d17fd220fd92dfaec as well. I'm having issues with the tar.gz file, but please try it and see what happens.
Looks like they're working on releasing the new 0.8.2 version of the Hugging Face LLM inference Docker image for SageMaker, based on this thread: https://github.com/huggingface/text-generation-inference/issues/390
That should fix this issue, according to this blog: https://huggingface.co/blog/falcon
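Once that image is available, deploying along the lines of the sagemaker-huggingface-llm blog should look roughly like the sketch below (role is your execution role; the version string, model id, and instance type are assumptions to adapt):

import json
from sagemaker.huggingface import HuggingFaceModel, get_huggingface_llm_image_uri

# Retrieve the Hugging Face LLM (TGI) container image for SageMaker.
llm_image = get_huggingface_llm_image_uri("huggingface", version="0.8.2")

llm_model = HuggingFaceModel(
    role=role,  # your SageMaker execution role
    image_uri=llm_image,
    env={
        "HF_MODEL_ID": "tiiuae/falcon-7b-instruct",  # model to load from the Hub
        "SM_NUM_GPUS": json.dumps(1),                # GPUs per replica
        "MAX_INPUT_LENGTH": json.dumps(1024),
        "MAX_TOTAL_TOKENS": json.dumps(2048),
    },
)
llm = llm_model.deploy(
    initial_instance_count=1,
    instance_type="ml.g5.2xlarge",
)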
Hi, I am using the HuggingFace estimator for a training job and I get this same error. I am using the transformers repo's token-classification example, run_ner.py, as the entry point, but it throws the following error:

ValueError: Loading tiiuae/falcon-40b requires you to execute the configuration file in that repo on your local machine. Make sure you have read the code there to avoid malicious use, then set the option trust_remote_code=True to remove this error.
Command "/opt/conda/bin/python3.9 run_ner.py --dataset_name JayalekshmiGopakumar/falcon_doclayn, exit code: 1
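For what it's worth, recent run_ner.py revisions expose a trust_remote_code model argument, so forwarding the flag as an estimator hyperparameter may work. A sketch under that assumption (role, versions, paths, and the dataset are placeholders; check that your copy of run_ner.py actually accepts --trust_remote_code):

from sagemaker.huggingface import HuggingFace

huggingface_estimator = HuggingFace(
    entry_point="run_ner.py",
    source_dir="./token-classification",  # placeholder: folder containing the script
    role=role,                            # your SageMaker execution role
    instance_type="ml.p3.2xlarge",
    instance_count=1,
    transformers_version="4.28",
    pytorch_version="2.0",
    py_version="py310",
    hyperparameters={
        "model_name_or_path": "tiiuae/falcon-40b",
        "dataset_name": "conll2003",   # placeholder dataset
        "trust_remote_code": True,     # only honored if the script defines this arg
        "output_dir": "/opt/ml/model",
        "do_train": True,
    },
)
huggingface_estimator.fit()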
I had the same issue and resolved it by updating to transformers==4.34.0.
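In a notebook that's just the line below (restart the kernel afterwards so the new version is picked up):

!pip install "transformers==4.34.0"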