Openai??
#2 opened by jostelo
The Space is not running with a GPU. They are probably just running an OpenAI-compatible backend like vLLM or TGI somewhere else (maybe on the hessian.AI cluster?). That's why they change the api_base to point at their own deployment server.
So no, it is not just using OpenAI's models! Without the key it wouldn't work.
Thank you very much for the reply! I think I understand now. They basically mimicked the OpenAI API for their own model, correct? That is what got me confused.
Yes, many inference libraries mimic the OpenAI API, which makes it possible to use the openai pip package for inference just by changing the API endpoint.
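For anyone landing here later, here is a minimal sketch of what that looks like with the openai pip package. The URL, key, and model name are placeholders, not the actual deployment used by this Space; also note that older versions of the package set `openai.api_base` instead of passing `base_url` to the client:

```python
# Sketch: point the openai client at a self-hosted, OpenAI-compatible
# backend (e.g. vLLM or TGI) by overriding the base URL.
from openai import OpenAI

client = OpenAI(
    base_url="https://your-inference-server.example.com/v1",  # placeholder endpoint
    api_key="your-api-key",  # whatever key the backend expects
)

# The request format is the same as for OpenAI's own API;
# only the endpoint and model name differ.
response = client.chat.completions.create(
    model="your-model-name",  # the model served by the backend
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```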