Weights do not work (non-128g weights do work!)
#2, opened by ryanlarimer
Tried the weights in both repositories with the main branch of transformers:
https://github.com/qwopqwop200/GPTQ-for-LLaMa
https://github.com/oobabooga/GPTQ-for-LLaMa
RuntimeError: Error(s) in loading state_dict for LlamaForCausalLM:
Missing key(s) in state_dict: "model.layers.0.self_attn.k_proj.bias", "model.layers.0.self_attn.o_proj.bias", ...
then
Unexpected key(s) in state_dict: "model.layers.0.self_attn.k_proj.g_idx", "model.layers.0.self_attn.o_proj.g_idx", ...
For comparison, these weights load fine: https://huggingface.co/MetaIX/GPT4-X-Alpaca-30B-Int4