Controlled Image Variability via Diffusion Processes
dc.contributor.author | Zhu, Yueze | en_US |
dc.contributor.author | Mitra, Niloy J. | en_US |
dc.contributor.editor | Ceylan, Duygu | en_US |
dc.contributor.editor | Li, Tzu-Mao | en_US |
dc.date.accessioned | 2025-05-09T09:36:17Z | |
dc.date.available | 2025-05-09T09:36:17Z | |
dc.date.issued | 2025 | |
dc.description.abstract | Diffusion models have shown remarkable abilities in generating realistic images. Unfortunately, diffusion processes do not directly produce diverse samples. Recent work has addressed this problem by applying a joint-particle time-evolving potential force that encourages varied and distinct generations. However, such methods focus on improving diversity across a batch of generations rather than on producing variations of a specific sample. In this paper, we propose a method for creating subtle variations of a single (generated) image: specifically, we propose Single Sample Refinement, a simple, training-free method to increase the diversity of one specific sample at different levels of variability. This mode is useful for creative content generation, allowing users to explore controlled variations without sacrificing the identity of the main objects. | en_US |
dc.description.sectionheaders | Short Paper 4 | |
dc.description.seriesinformation | Eurographics 2025 - Short Papers | |
dc.identifier.doi | 10.2312/egs.20251044 | |
dc.identifier.isbn | 978-3-03868-268-4 | |
dc.identifier.issn | 1017-4656 | |
dc.identifier.pages | 4 pages | |
dc.identifier.uri | https://doi.org/10.2312/egs.20251044 | |
dc.identifier.uri | https://diglib.eg.org/handle/10.2312/egs20251044 | |
dc.publisher | The Eurographics Association | en_US |
dc.rights | Attribution 4.0 International License | |
dc.rights.uri | https://creativecommons.org/licenses/by/4.0/ | |
dc.title | Controlled Image Variability via Diffusion Processes | en_US |
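The abstract describes controlled, training-free variation of a single generated sample at different variability levels. The record does not spell out how Single Sample Refinement works, so the sketch below instead illustrates a standard baseline for this mode of use, SDEdit-style partial re-noising: the sample is pushed part-way back along the diffusion forward process, and a `strength` knob (an assumed name here, not from the paper) controls how far the variations may drift from the original. The toy below is pure stdlib and measures only the perturbation step, not a full denoiser pass.

```python
import math
import random

def renoise(x0, strength, rnd):
    """Partially re-noise a sample; strength in (0, 1] sets the variability level.

    alpha_bar = 1 - strength stands in for the cumulative noise-schedule
    value at the chosen intermediate diffusion timestep.
    """
    alpha_bar = 1.0 - strength
    return [
        math.sqrt(alpha_bar) * v + math.sqrt(1.0 - alpha_bar) * rnd.gauss(0.0, 1.0)
        for v in x0
    ]

rnd = random.Random(0)
x0 = [rnd.gauss(0.0, 1.0) for _ in range(4096)]  # stand-in for an image/latent

# In a real pipeline the re-noised sample would be denoised again by the model;
# here we only check that higher strength moves the sample further from x0.
deviations = [
    sum(abs(a - b) for a, b in zip(renoise(x0, s, rnd), x0)) / len(x0)
    for s in (0.1, 0.5, 0.9)
]
```

The monotone growth of `deviations` with `strength` is the "different levels of variability" knob the abstract refers to: small strengths preserve the identity of the main objects, large strengths allow broader exploration.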