dynamic rope_scaling on internlm2_5-7b llamafied

#862
by ethanc8 - opened

Hi, I converted https://huggingface.co/internlm/internlm2_5-7b to Llama-format safetensors so that it doesn't need trust_remote_code: https://huggingface.co/ethanc8/internlm2_5-7b-llamafied. However, when I submitted it to the leaderboard, I got:

Model "ethanc8/internlm2_5-7b-llamafied" was not found or misconfigured on the hub! Error raised was rope_type

I'm not sure, but the error might be related to the rope_scaling block in my config.json:

  "rope_scaling": {
    "factor": 2.0,
    "type": "dynamic"
  },
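
A quick way to reproduce this locally (a minimal sketch on my side; the leaderboard's submit tab says models must be loadable with AutoConfig) is:

    from transformers import AutoConfig

    # If the rope_scaling block is the problem, this should raise the
    # same "rope_type" error the leaderboard reported.
    config = AutoConfig.from_pretrained("ethanc8/internlm2_5-7b-llamafied")
    print(config.rope_scaling)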
Open LLM Leaderboard org

Hi!
I did not manage to reproduce your error. Could you provide the full parameters you used to submit?

I don't have all the parameters, but I used https://huggingface.co/ethanc8/internlm2_5-7b-llamafied and specified bfloat16.

Open LLM Leaderboard org

Can you try again, and provide me with 1) all the parameters of the form and 2) a screenshot of your screen with the full params + error message?

I will try later today.

[Screenshot: submission form parameters and the resulting error message]

Open LLM Leaderboard org

Hi!
Thanks for the information, I managed to reproduce the error.
We are using transformers==4.43.1 on the leaderboard. Did you make sure your model can be loaded with AutoConfig (as indicated in the submit tab)?

The following fails:

    from transformers import AutoConfig

    AutoConfig.from_pretrained(
        "ethanc8/internlm2_5-7b-llamafied",
        revision="58482e6989cb3b09da8f20d6ab9101b922c53acb",
    )
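
For context (an inference from the error text above, not a confirmed diagnosis): transformers 4.43 validates rope_scaling against a rope_type key, and a dict carrying only the legacy type key can surface as a KeyError: 'rope_type'. For "dynamic" scaling, the shape 4.43+ expects is roughly:

    # Hedged: the rope_scaling shape transformers >= 4.43 validates for
    # dynamic NTK scaling ("rope_type" and "factor" are the required keys).
    rope_scaling = {
        "rope_type": "dynamic",
        "factor": 2.0,
    }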

I will try that.

Open LLM Leaderboard org

If your model loads correctly in earlier versions of transformers, I invite you to open an issue on the transformers GitHub repo; if not, please fix your model.
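
If it does turn out to be the missing rope_type key, one possible patch, sketched under that assumption (the logic below is illustrative, not a confirmed fix), is to mirror the legacy key in config.json and re-upload it:

    import json
    from huggingface_hub import hf_hub_download

    # Fetch the current config.json from the model repo.
    path = hf_hub_download("ethanc8/internlm2_5-7b-llamafied", "config.json")
    with open(path) as f:
        config = json.load(f)

    # Mirror the legacy "type" key under the newer "rope_type" key,
    # keeping both so older transformers versions still work.
    scaling = config.get("rope_scaling") or {}
    if "type" in scaling and "rope_type" not in scaling:
        scaling["rope_type"] = scaling["type"]
        config["rope_scaling"] = scaling

    with open("config.json", "w") as f:
        json.dump(config, f, indent=2)
    # Then upload the patched config.json back to the model repo
    # (e.g. with huggingface_hub.upload_file) and retry AutoConfig.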

Closing as we can't do anything for the moment, but ping me once it's good!

clefourrier changed discussion status to closed
