---
title: hirol/control_any5_openpose
emoji: π
license: mit
---
|
This model adds character skeletons to a forest scene generated by Stable Diffusion while keeping the background unchanged, producing controllable characters.
|
|
|
https://huggingface.co/spaces/hirol/controlnetOverMask |
|
|
|
# ControlnetWithBackground |
|
### ControlNet

### Inpainting

### Stable Diffusion
|
|
|
Use ControlNet to generate sketches and characters while keeping the background unchanged.
|
|
|
Often, after generating a background image we are happy with, we want to add new elements to it, but applying ControlNet directly alters the original background. This project adds elements to the image while keeping the background unchanged, operating directly on the original background.
|
|
|
Supports skeletal (OpenPose) character generation and sketch generation.
|
|
|
The inpainting model is optimized for the general domain, compared against the stable-diffusion-inpainting model.
|
|
|
|
|
```python
import torch
from diffusers import StableDiffusionControlNetPipeline, ControlNetModel, UniPCMultistepScheduler

# Load the OpenPose-conditioned ControlNet weights shipped with this repository
controlnet = ControlNetModel.from_pretrained("./models/control_any5_openpose", torch_dtype=torch.float16)
```
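To add a character to an existing image while preserving the background, the loaded ControlNet can be combined with an inpainting pipeline. The sketch below is one possible wiring, not necessarily how this Space does it: the base checkpoint (`runwayml/stable-diffusion-inpainting`), the file names, and the prompt are placeholders for illustration.

```python
import torch
from diffusers import StableDiffusionControlNetInpaintPipeline, ControlNetModel, UniPCMultistepScheduler
from diffusers.utils import load_image

# OpenPose ControlNet from this repository
controlnet = ControlNetModel.from_pretrained(
    "./models/control_any5_openpose", torch_dtype=torch.float16
)

# Placeholder base checkpoint; the project's own general-domain inpainting model could be used instead
pipe = StableDiffusionControlNetInpaintPipeline.from_pretrained(
    "runwayml/stable-diffusion-inpainting",
    controlnet=controlnet,
    torch_dtype=torch.float16,
)
pipe.scheduler = UniPCMultistepScheduler.from_config(pipe.scheduler.config)
pipe.enable_model_cpu_offload()

# Placeholder inputs: the finished background, a mask over the region where the
# character should appear, and an OpenPose skeleton image as the control signal
background = load_image("forest_background.png")
mask = load_image("character_mask.png")
skeleton = load_image("openpose_skeleton.png")

result = pipe(
    "a person standing in a forest, best quality",
    image=background,
    mask_image=mask,
    control_image=skeleton,
    num_inference_steps=30,
).images[0]
result.save("person_in_forest.png")
```

Because only the masked region is regenerated, the surrounding background pixels stay essentially untouched while the OpenPose skeleton controls the pose of the inserted character.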
|
|
|
|
|
### Two modes: OpenPose control and sketch (manuscript) control

#### OpenPose control

Add character skeletons to a forest scene generated by Stable Diffusion while keeping the background unchanged, producing controllable characters.
|
<img src="https://huggingface.co/spaces/hirol/ControlnetWithBackground/resolve/main/person.png" width="400" height="300"> |
|
<video width="320" height="240" controls> |
|
<source src="https://huggingface.co/spaces/hirol/ControlnetWithBackground/resolve/main/person_control.mp4" type="video/mp4"> |
|
</video> |
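The skeleton image used as the control signal can also be extracted from a reference photo. A minimal sketch, assuming the `controlnet_aux` annotator package and a hypothetical input file; the Space may compute skeletons differently:

```python
from controlnet_aux import OpenposeDetector
from diffusers.utils import load_image

# Load the OpenPose annotator (weights hosted under lllyasviel/ControlNet)
openpose = OpenposeDetector.from_pretrained("lllyasviel/ControlNet")

# Hypothetical reference photo of a person in the desired pose
reference = load_image("reference_person.png")
skeleton = openpose(reference)  # PIL image of the detected skeleton
skeleton.save("openpose_skeleton.png")
```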
|
|