VMV2023
Browsing VMV2023 by Subject "based rendering"
Now showing 1 - 3 of 3
Item Art-directable Stroke-based Rendering on Mobile Devices (The Eurographics Association, 2023)
Wagner, Ronja; Schulz, Sebastian; Reimann, Max; Semmo, Amir; Döllner, Jürgen; Trapp, Matthias; Guthe, Michael; Grosch, Thorsten
This paper introduces an art-directable stroke-based rendering technique for transforming photos into painterly renditions on mobile devices. Unlike previous approaches that rely on time-consuming iterative computations and explicit brush-stroke geometry, our method offers an interactive image-based implementation tailored to the capabilities of modern mobile devices. The technique places curved brush strokes in multiple passes, leveraging a texture bombing algorithm. To preserve and highlight essential details during stylization, we incorporate additional information such as image salience, depth, and facial landmarks as parameters. Our technique lets users adjust and refine the stylized image during editing through a wide range of parameters and masks. The result is an interactive painterly stylization tool that supports high-resolution input images, providing users with an immersive and engaging artistic experience on their mobile devices.

Item N-SfC: Robust and Fast Shape Estimation from Caustic Images (The Eurographics Association, 2023)
Kassubeck, Marc; Kappel, Moritz; Castillo, Susana; Magnor, Marcus; Guthe, Michael; Grosch, Thorsten
This paper addresses the highly challenging problem of reconstructing the shape of a refracting object from a single image of its resulting caustic. Because transparent refracting objects are ubiquitous in everyday life, reconstructing their shape has a multitude of practical applications. While we focus our attention on inline shape reconstruction in glass fabrication processes, our methodology could be adapted to scenarios where the limiting factor is a lack of input measurements to fully constrain the reconstruction problem.
The recent Shape from Caustics (SfC) method casts this problem as the inverse of a light propagation simulation that synthesizes the caustic image, which can be solved with a differentiable renderer. However, the inherent complexity of light transport through refracting surfaces currently limits its practical application in terms of reconstruction speed and robustness. We therefore introduce Neural-Shape from Caustics (N-SfC), a learning-based extension that incorporates two components into the reconstruction pipeline: a denoising module, which both reduces the cost of the light transport simulation and helps find a better minimum, and an optimization process based on learned gradient descent, which converges in fewer iterations. Extensive experiments demonstrate that we significantly outperform the current state of the art in both computational speed and final surface error.

Item PlenopticPoints: Rasterizing Neural Feature Points for High-Quality Novel View Synthesis (The Eurographics Association, 2023)
Hahlbohm, Florian; Kappel, Moritz; Tauscher, Jan-Philipp; Eisemann, Martin; Magnor, Marcus; Guthe, Michael; Grosch, Thorsten
This paper presents a point-based neural rendering approach for complex real-world objects from a set of photographs. Our method is specifically geared towards representing fine detail and reflective surface characteristics at higher quality than current state-of-the-art methods. From the photographs, we create a 3D point model based on optimized neural feature points located on a regular grid. For rendering, we employ view-dependent spherical harmonics shading, differentiable rasterization, and a deep neural rendering network. By combining a point-based approach with novel regularizers, our method accurately represents local detail such as fine geometry and high-frequency texture while convincingly interpolating unseen viewpoints during inference.
Our method achieves about 7 frames per second at 800×800 pixel output resolution on commodity hardware, putting it within reach of real-time rendering applications.
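The view-dependent spherical harmonics (SH) shading mentioned in the PlenopticPoints abstract can be illustrated with a minimal sketch: each point stores one SH coefficient vector per color channel, and the rendered color is the dot product of those coefficients with the SH basis evaluated at the viewing direction. The function names, the degree-2 truncation, and the per-channel layout below are illustrative assumptions, not details taken from the paper.

```python
# Hedged sketch of view-dependent SH shading for a single point.
# Assumption: real SH basis truncated at degree 2 (9 terms), which is a
# common choice for view-dependent color; the paper may use another degree.
import numpy as np

def sh_basis_deg2(d):
    """Real SH basis values (9 terms, degrees 0-2) for a unit direction d."""
    x, y, z = d
    return np.array([
        0.282095,                        # l=0
        0.488603 * y,                    # l=1, m=-1
        0.488603 * z,                    # l=1, m=0
        0.488603 * x,                    # l=1, m=1
        1.092548 * x * y,                # l=2, m=-2
        1.092548 * y * z,                # l=2, m=-1
        0.315392 * (3.0 * z * z - 1.0),  # l=2, m=0
        1.092548 * x * z,                # l=2, m=1
        0.546274 * (x * x - y * y),      # l=2, m=2
    ])

def shade_point(sh_coeffs, view_dir):
    """View-dependent RGB for one point.

    sh_coeffs: (3, 9) array, one row of SH coefficients per color channel.
    view_dir:  unit-length viewing direction.
    """
    return sh_coeffs @ sh_basis_deg2(view_dir)  # -> (3,) RGB

# A point with only a degree-0 coefficient is view-independent:
coeffs = np.zeros((3, 9))
coeffs[:, 0] = 1.0 / 0.282095  # encodes constant RGB = (1, 1, 1)
print(shade_point(coeffs, np.array([0.0, 0.0, 1.0])))  # ~[1. 1. 1.]
```

Nonzero higher-order coefficients make the color vary with the viewing direction, which is how a point cloud can convey the reflective surface characteristics the abstract highlights.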