yichaodu/DiffusionDPO-bias-internvl-1.5

Tags: Text-to-Image · Diffusers · Safetensors · stable-diffusion · stable-diffusion-diffusers · DPO · DiffusionDPO
arXiv: 2407.04842
DiffusionDPO-bias-internvl-1.5/README.md (branch: main)

Commit History
Upload README.md with huggingface_hub · 3159715 (verified) · yichaodu committed on Jul 9
Upload README.md with huggingface_hub · a499ea9 (verified) · yichaodu committed on Jun 20
Upload README.md with huggingface_hub · ba4a97a (verified) · yichaodu committed on Jun 20
Upload README.md with huggingface_hub · 2c5321d (verified) · yichaodu committed on Jun 19