Export WizardCoder to ONNX
#20
by Ryan30
I want to convert WizardCoder to ONNX.
Following https://huggingface.co/docs/optimum/onnxruntime/usage_guides/models, I wrote the code below to export the model:
from optimum.onnxruntime import ORTModelForCausalLM

model_path = "../WizardCoder-15B-V1.0"
# export=True converts the PyTorch checkpoint to ONNX on the fly
onnx_model = ORTModelForCausalLM.from_pretrained(model_path, export=True)
It reports the error below:
ValueError: Trying to export a gpt_bigcode model, that is a custom or unsupported architecture for the task text-generation, but no custom onnx configuration was passed as
`custom_onnx_configs`. Please refer to https://huggingface.co/docs/optimum/main/en/exporters/onnx/usage_guides/export_a_model#custom-export-of-transformers-models for an
example on how to export custom models. For the task text-generation, the Optimum ONNX exporter supports natively the architectures: ['bart', 'blenderbot', 'blenderbot_small',
'bloom', 'codegen', 'gpt2', 'gptj', 'gpt_neo', 'gpt_neox', 'marian', 'mbart', 'opt', 'llama', 'pegasus'].
It looks like I need to implement a custom ONNX config for the gpt_bigcode architecture.
Has anyone implemented that?
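
Based on the custom-export guide linked in the error, I imagine it would look roughly like the sketch below. This is untested and largely guesswork on my part: the GPTBigCodeOnnxConfig class, the attribute mappings, and the "model" key in custom_onnx_configs are my assumptions, and I left out past key values because gpt_bigcode's multi-query attention uses a fused KV cache that the default dummy-input generator probably can't produce.

from transformers import AutoConfig
from optimum.exporters.onnx import main_export
from optimum.exporters.onnx.config import TextDecoderOnnxConfig
from optimum.utils import NormalizedTextConfig

model_path = "../WizardCoder-15B-V1.0"

# Hypothetical config: gpt_bigcode keeps GPT-2-style attribute names
# (n_embd, n_layer, n_head), so map them for the normalized config.
class GPTBigCodeOnnxConfig(TextDecoderOnnxConfig):
    DEFAULT_ONNX_OPSET = 13
    NORMALIZED_CONFIG_CLASS = NormalizedTextConfig.with_args(
        hidden_size="n_embd", num_layers="n_layer", num_attention_heads="n_head"
    )

config = AutoConfig.from_pretrained(model_path)
onnx_config = GPTBigCodeOnnxConfig(config=config, task="text-generation")

# "model" as the dictionary key is a guess for a decoder-only export;
# exporting without past key values sidesteps gpt_bigcode's multi-query
# KV-cache layout, which would otherwise need a custom dummy generator.
main_export(
    model_path,
    output="wizardcoder_onnx",
    task="text-generation",
    custom_onnx_configs={"model": onnx_config},
)

If someone has a working version (especially one that exports with past key values for faster generation), I'd appreciate a pointer.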