ONNX model serving: what data input should be provided?
Hi,
I used Optimum to convert this model to ONNX format and am trying to serve it with model serving.
When I load the model directly from the local folder it works fine and returns logits, but when I call it through the API it doesn't work.
What data input do I need to pass to the model when using the API (model serving)? Where can I find the correct input format?
Here are my code and the error. Can someone guide me on how to fix this?
import requests

API_URL = "my-url.com/v2/models/sentiment-classification/infer"
headers = {"content-type": "application/json"}
def query(payload):
    response = requests.post(API_URL, headers=headers, json=payload)
    return response.json()
output = query({
    "inputs": "I like you."
})
print(output)
send: b'POST /v2/models/sentiment-classification/infer HTTP/1.1\r\nHost: my-url.com\r\nUser-Agent: python-requests/2.31.0\r\nAccept-Encoding: gzip, deflate\r\nAccept: */*\r\nConnection: keep-alive\r\ncontent-type: application/json\r\nContent-Length: 25\r\n\r\n'
send: b'{"inputs": "I like you."}'
reply: 'HTTP/1.1 400 Bad Request\r\n'
header: content-type: application/json
header: date: Sat, 08 Jul 2023 11:41:03 GMT
header: content-length: 119
header: set-cookie: XXXXX=XXXXX; path=/; HttpOnly; Secure; SameSite=None
{'code': 3, 'message': 'json: cannot unmarshal string into Go struct field RESTRequest.inputs of type []main.InputTensor'}
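From the error message it looks like the endpoint expects the KServe/Triton V2 REST inference protocol, where "inputs" is a list of named tensors (name, shape, datatype, data) rather than a raw string. Below is my guess at what the request would need to look like, tokenizing the text locally first. The tokenizer checkpoint and the input names (input_ids, attention_mask) are assumptions based on a typical text-classification export, not something I have confirmed against the served model. Is this the right format?

```python
import requests
from transformers import AutoTokenizer

API_URL = "my-url.com/v2/models/sentiment-classification/infer"
headers = {"content-type": "application/json"}

# Placeholder checkpoint: use whatever checkpoint the ONNX model was exported from.
tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased-finetuned-sst-2-english")

def build_v2_payload(text):
    # Tokenize locally, then wrap each tensor in the V2 protocol structure:
    # a list of named tensors, each with a shape, datatype, and flat data list.
    encoded = tokenizer(text, return_tensors="np")
    return {
        "inputs": [
            {
                "name": name,                      # must match the ONNX model's input names
                "shape": list(array.shape),
                "datatype": "INT64",               # assuming integer token ids / attention mask
                "data": array.flatten().tolist(),
            }
            for name, array in encoded.items()
        ]
    }

payload = build_v2_payload("I like you.")
response = requests.post(API_URL, headers=headers, json=payload)
print(response.json())
```

If this is the right direction, where would I find the exact input names, shapes, and datatypes the served model expects, e.g. from a model metadata endpoint?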