Hosted inference API
#4
by
alexgradinaru
Will this model be available in the Hosted Inference API?
I get an error when trying to test it online or locally: "AutoPipeline can't find a pipeline linked to StableDiffusionLDM3DPipeline for None".
It works great with the diffusers pipeline.
Hi!
The hosted inference API won't work for LDM3D, since our model has two outputs and the API only supports one task per model (e.g. text-to-image, image-to-image, or image-to-depth).
If you want to try it, you could also use the HF space we open-sourced: https://huggingface.co/spaces/Intel/ldm3d
It's using a different checkpoint, but at least you'll get an idea of the output.
Estelle
Is there any Colab notebook for this?