Controlled Image Variability via Diffusion Processes

dc.contributor.author: Zhu, Yueze (en_US)
dc.contributor.author: Mitra, Niloy J. (en_US)
dc.contributor.editor: Ceylan, Duygu (en_US)
dc.contributor.editor: Li, Tzu-Mao (en_US)
dc.date.accessioned: 2025-05-09T09:36:17Z
dc.date.available: 2025-05-09T09:36:17Z
dc.date.issued: 2025
dc.description.abstract: Diffusion models have shown remarkable abilities in generating realistic images. Unfortunately, diffusion processes do not directly produce diverse samples. Recent work has addressed this problem by applying a joint-particle, time-evolving potential force that encourages varied and distinct generations. However, such a method improves diversity across a batch of generations rather than producing variations of a specific sample. In this paper, we propose a method for creating subtle variations of a single (generated) image. Specifically, we propose Single Sample Refinement, a simple, training-free method to improve the diversity of one specific sample at different levels of variability. This mode is useful for creative content generation, allowing users to explore controlled variations without sacrificing the identity of the main objects. (en_US)
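Note: the abstract does not detail the Single Sample Refinement algorithm itself. As a rough illustration of the general idea of training-free, strength-controlled variation of a single diffusion sample, the Python sketch below shows a common baseline strategy (re-noising the image to an intermediate timestep and denoising again, with the re-noising depth acting as the variability level). The helpers add_noise and denoise_from are hypothetical placeholders for a diffusion model's forward and reverse processes; this is not the authors' method.

    def vary_sample(x0, add_noise, denoise_from, strength=0.3, num_variants=4, total_steps=50):
        """Produce variations of a single sample x0 at a chosen variability level.

        x0            : the image (e.g., a latent or pixel tensor) to vary
        add_noise     : placeholder callable (x0, t) -> x_t, forward diffusion to step t
        denoise_from  : placeholder callable (x_t, t) -> x0_hat, reverse diffusion from step t
        strength      : in (0, 1]; larger values re-noise deeper and yield stronger variation
        """
        # Map the desired variability level to an intermediate diffusion timestep.
        t = max(1, int(strength * total_steps))
        variants = []
        for _ in range(num_variants):
            # Each variant uses fresh forward noise, so the reverse process
            # lands on a slightly different image near the original.
            x_t = add_noise(x0, t)
            variants.append(denoise_from(x_t, t))
        return variants

With small strength values the variants keep the identity and layout of the main objects; larger values trade identity preservation for diversity.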
dc.description.sectionheaders: Short Paper 4
dc.description.seriesinformation: Eurographics 2025 - Short Papers
dc.identifier.doi: 10.2312/egs.20251044
dc.identifier.isbn: 978-3-03868-268-4
dc.identifier.issn: 1017-4656
dc.identifier.pages: 4 pages
dc.identifier.uri: https://doi.org/10.2312/egs.20251044
dc.identifier.uri: https://diglib.eg.org/handle/10.2312/egs20251044
dc.publisher: The Eurographics Association (en_US)
dc.rights: Attribution 4.0 International License
dc.rights.uri: https://creativecommons.org/licenses/by/4.0/
dc.title: Controlled Image Variability via Diffusion Processes (en_US)
Files
Original bundle
Name: egs20251044.pdf
Size: 18.73 MB
Format: Adobe Portable Document Format