Getting a Bizarre Error

#1
by chenbowen-184 - opened

I'm getting this error while loading the model in vLLM. It's so weird, since the model is literally named AWQ:
ValueError: torch.bfloat16 is not supported for quantization method awq. Supported dtypes: [torch.float16]
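For context: vLLM's AWQ kernels only support float16. With the default `dtype="auto"`, vLLM inherits the checkpoint's `torch_dtype` from `config.json`; if that is bfloat16, loading fails with exactly this ValueError even though the weights are AWQ-quantized. Overriding the dtype at load time resolves it. A minimal sketch, where the model name is a placeholder for whichever AWQ checkpoint you are loading:

```python
from vllm import LLM, SamplingParams

# AWQ kernels in vLLM only run in float16. Forcing dtype="float16"
# (or "half") overrides the bfloat16 torch_dtype in the checkpoint's
# config.json, which is what triggers the ValueError otherwise.
llm = LLM(
    model="TheBloke/Llama-2-7B-AWQ",  # placeholder: use your AWQ checkpoint
    quantization="awq",
    dtype="float16",
)

outputs = llm.generate(
    ["Hello, my name is"],
    SamplingParams(max_tokens=32),
)
print(outputs[0].outputs[0].text)
```

The equivalent when launching the OpenAI-compatible server is to pass `--dtype float16` alongside `--quantization awq`.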
