Error when I load the model

#122
by AlexTian - opened

I got the error 'PhiConfig' object has no attribute 'n_embd'. What should I do?

I am trying to load the model from Hugging Face and convert it into the RKLLM format, but this error occurred. I think it might have been caused by the recent update. Can anyone tell me which revision of phi-2 predates this update?

Thanks 🙏

Hi @AlexTian, can you post a short code snippet so we can try to reproduce it?

I solved this by pinning revision 834565c. I think most developers haven't updated their conversion code for the latest model, so it would help if the model card carried a note about this update.
Here is my code; the example is from airockchip's rknn-llm toolkit (https://github.com/airockchip/rknn-llm/tree/main):

from rkllm.api import RKLLM

modelpath = './phi-2'
llm = RKLLM()

# Load the Hugging Face model from a local directory
ret = llm.load_huggingface(model=modelpath)
if ret != 0:
    print('Load model failed!')
    exit(ret)

# Build the quantized model for the target platform
ret = llm.build(do_quantization=True, optimization_level=1, quantized_dtype='w8a8', target_platform='rk3588')
if ret != 0:
    print('Build model failed!')
    exit(ret)

# Export the converted model in RKLLM format
ret = llm.export_rkllm("./phi.rkllm")
if ret != 0:
    print('Export model failed!')
    exit(ret)
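For context on the error itself: the updated phi-2 repository appears to use the Transformers-native PhiConfig, which names the embedding width hidden_size, while older conversion code reads the legacy n_embd attribute from the pre-update config. Below is a minimal sketch of that mismatch using hypothetical stand-in config classes (LegacyConfig, NewConfig, and embed_dim are illustrative names, not part of any library):

```python
class LegacyConfig:
    """Stand-in for the pre-update phi-2 config, which exposed n_embd."""
    n_embd = 2560

class NewConfig:
    """Stand-in for the updated PhiConfig, which exposes hidden_size."""
    hidden_size = 2560

def embed_dim(config):
    # Prefer the legacy attribute name, fall back to the newer one;
    # a converter that only reads n_embd raises AttributeError on NewConfig.
    return getattr(config, "n_embd", getattr(config, "hidden_size", None))

print(embed_dim(LegacyConfig()))  # 2560
print(embed_dim(NewConfig()))     # 2560
```

Rather than patching the converter, the simpler workaround used above is to fetch the pre-update snapshot, e.g. by downloading the pinned revision to a local directory and pointing load_huggingface at that path.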
AlexTian changed discussion status to closed
