Error while testing inference
#1 opened by palloo
Could not load model ldilov/stablelm-tuned-alpha-7b-4bit-128g-descact-sym-true-sequential with any of the following classes: (<class 'transformers.models.gpt_neox.modeling_gpt_neox.GPTNeoXForCausalLM'>,)
Can you provide more context for the error? How do you load the model, and do you have minimal code to reproduce the issue?
I used the default Hugging Face Spaces setup with an A10G machine.
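For reference, a hypothetical minimal reproduction of what the default Space inference path roughly attempts; the exact code the Space runs is an assumption, but loading this GPTQ-only repo through plain transformers produces the error above:

```python
from transformers import pipeline

# Assumed minimal repro: the repo only contains GPTQ-quantized weights,
# so loading it with the stock text-generation pipeline fails to find
# a usable checkpoint for GPTNeoXForCausalLM.
pipe = pipeline(
    "text-generation",
    model="ldilov/stablelm-tuned-alpha-7b-4bit-128g-descact-sym-true-sequential",
)
print(pipe("Hello, how are you?", max_new_tokens=32))
```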
I can't say about Hugging Face Spaces; I have no idea what software they are running. I advise you to use AutoGPTQ with Transformers to run it locally, for example as sketched below.
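A minimal local sketch, assuming the repo ships safetensors GPTQ weights and that you have auto-gptq and transformers installed with a CUDA GPU available; parameter choices like `use_triton=False` are assumptions, not taken from the repo:

```python
from transformers import AutoTokenizer
from auto_gptq import AutoGPTQForCausalLM

model_id = "ldilov/stablelm-tuned-alpha-7b-4bit-128g-descact-sym-true-sequential"

tokenizer = AutoTokenizer.from_pretrained(model_id, use_fast=True)

# Load the GPTQ-quantized checkpoint directly with AutoGPTQ instead of
# the plain transformers loaders, which cannot read these weights here.
model = AutoGPTQForCausalLM.from_quantized(
    model_id,
    device="cuda:0",
    use_safetensors=True,  # assumption: the repo provides a .safetensors file
    use_triton=False,
)

prompt = "Hello, how are you?"
inputs = tokenizer(prompt, return_tensors="pt").to("cuda:0")
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```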
Alright, yeah, most transformers models don't run on Spaces; I tried.
palloo changed discussion status to closed