Browsing by Author "Grogorick, Steve"
Now showing 1 - 3 of 3
Item Immersive Free-Viewpoint Panorama Rendering from Omnidirectional Stereo Video (© 2023 Eurographics - The European Association for Computer Graphics and John Wiley & Sons Ltd., 2023)
Mühlhausen, Moritz; Kappel, Moritz; Kassubeck, Marc; Wöhler, Leslie; Grogorick, Steve; Castillo, Susana; Eisemann, Martin; Magnor, Marcus; Hauser, Helwig and Alliez, Pierre
In this paper, we tackle the challenging problem of rendering real-world 360° panorama videos that support full 6 degrees-of-freedom (DoF) head motion from a prerecorded omnidirectional stereo (ODS) video. In contrast to recent approaches that create novel views for individual panorama frames, we introduce a video-specific, temporally consistent multi-sphere image (MSI) scene representation. Given a conventional ODS video, we first extract information by estimating framewise descriptive feature maps. Then, we optimize the global MSI model using theory from recent research on neural radiance fields. Instead of a continuous scene function, the MSI representation stores colour and density information only for a discrete set of concentric spheres. To further improve the temporal consistency of our results, we apply an ancillary refinement step that optimizes the temporal coherency between successive video frames. Direct comparisons to recent baseline approaches show that our global MSI optimization yields superior visual quality. Our code and data will be made publicly available.

Item Optimizing Temporal Stability in Underwater Video Tone Mapping (The Eurographics Association, 2023)
Franz, Matthias; Thang, B. Matthias; Sackhoff, Pascal; Scholz, Timon; Möller, Jannis; Grogorick, Steve; Eisemann, Martin; Guthe, Michael; Grosch, Thorsten
In this paper, we present an approach for the temporal stabilization of depth-based underwater image tone mapping methods, for application to monocular RGB video. Typically, the goal is to improve the colors of objects in focus while leaving more distant regions nearly unchanged, preserving the underwater look-and-feel of the overall image. To do this, many methods rely on estimated depth to control the recolorization process, i.e., to enhance colors (reduce blue tint) only for objects close to the camera. However, while single-view depth estimation is usually consistent within a frame, it often suffers from inconsistencies across sequential frames, resulting in color fluctuations during tone mapping. We propose a simple yet effective inter-frame stabilization of the computed depth maps to achieve stable tone mapping results. An evaluation on eight test sequences shows its effectiveness in a wide range of underwater scenarios.
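As a rough illustration of the multi-sphere image (MSI) representation described in the first listing above, the following Python sketch alpha-composites colour and density samples taken on a discrete set of concentric spheres along a single viewing ray, in the spirit of neural-radiance-field volume rendering. This is a minimal sketch under assumed inputs (per-sphere RGB, density, and inter-sphere distances for one ray), not the authors' implementation; the function name and array layout are illustrative.

import numpy as np

def composite_msi_ray(rgb_per_sphere, density_per_sphere, deltas):
    # Alpha-composite samples taken on concentric spheres along one ray.
    # rgb_per_sphere:     (S, 3) colour sampled on each of S spheres, ordered near to far
    # density_per_sphere: (S,)   volume density sampled on each sphere
    # deltas:             (S,)   distance the ray travels between consecutive spheres
    alpha = 1.0 - np.exp(-density_per_sphere * deltas)       # per-sphere opacity
    trans = np.cumprod(1.0 - alpha + 1e-10)                  # light surviving past each sphere
    trans = np.concatenate(([1.0], trans[:-1]))              # transmittance reaching each sphere
    weights = alpha * trans                                  # contribution of each sphere
    return (weights[:, None] * rgb_per_sphere).sum(axis=0)   # composited RGB for the ray

# Toy usage: 32 spheres with random colours and densities.
S = 32
rgb = np.random.rand(S, 3)
sigma = np.random.rand(S) * 2.0
deltas = np.full(S, 0.1)
print(composite_msi_ray(rgb, sigma, deltas))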
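The second listing above stabilizes estimated depth maps across successive frames to avoid colour fluctuations in the tone-mapped video. The sketch below shows only one generic possibility for such a stabilization, a per-pixel exponential moving average over successive depth maps; the function name and smoothing weight are assumptions for illustration, not the authors' method.

import numpy as np

def stabilize_depth(depth_frames, smoothing=0.8):
    # Exponentially smooth per-pixel depth across frames to suppress flicker.
    # depth_frames: iterable of (H, W) depth maps from a single-view estimator
    # smoothing:    weight of the running estimate (higher = smoother, more lag)
    running = None
    for depth in depth_frames:
        depth = depth.astype(np.float64)
        running = depth if running is None else smoothing * running + (1.0 - smoothing) * depth
        yield running.copy()   # stabilized depth map used to drive the tone mapping

# Toy usage: three noisy depth maps of a static scene.
frames = [np.ones((4, 4)) + 0.05 * np.random.randn(4, 4) for _ in range(3)]
for d in stabilize_depth(frames):
    print(float(d.mean()))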
Item Stereo Inverse Brightness Modulation for Guidance in Dynamic Panorama Videos in Virtual Reality (© 2020 Eurographics - The European Association for Computer Graphics and John Wiley & Sons Ltd, 2020)
Grogorick, Steve; Tauscher, Jan-Philipp; Heesen, Nikkel; Castillo, Susana; Magnor, Marcus; Benes, Bedrich and Hauser, Helwig
The peak of virtual reality offers exciting new possibilities for the creation of media content, but it also poses new challenges. Some areas of interest might be overlooked because the visual content fills up a large portion of the viewers' visual field. Moreover, this content is available in 360° around the viewer, yielding locations completely out of sight and making, for example, recall or storytelling in cinematic Virtual Reality (VR) quite difficult. In this paper, we present an evaluation of Stereo Inverse Brightness Modulation for effective and subtle guidance of participants' attention while navigating dynamic virtual environments. The technique exploits the binocular rivalry effect of human stereo vision and was previously shown to be effective in static environments. Moreover, we propose an extension of the method for successful guidance towards target locations outside the initial visual field. We conduct three perceptual studies, using 13 distinct panorama videos and two VR systems (a VR head-mounted display and a fully immersive dome projection system), to investigate (1) general applicability to dynamic environments, (2) the influence of stimulus parameters and the VR system, and (3) the effectiveness of the proposed extension for out-of-sight targets. Our results demonstrate the applicability of the method to dynamic environments while maintaining its unobtrusive appearance.
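Stereo inverse brightness modulation, as described in the listing above, presents opposite brightness changes to the two eyes at the target location so that binocular rivalry draws attention there. The following sketch shows the basic idea on an assumed float stereo image pair with a circular target region; the exact stimulus shape, falloff, and modulation strength used in the studies are not reproduced here.

import numpy as np

def apply_inverse_brightness_cue(left, right, center, radius, strength=0.2):
    # Brighten a circular target region in one eye and darken it in the other,
    # producing a binocular-rivalry cue at (row, col) = center.
    # left, right: (H, W, 3) float images in [0, 1] for the left/right eye
    h, w = left.shape[:2]
    yy, xx = np.mgrid[0:h, 0:w]
    mask = (((yy - center[0]) ** 2 + (xx - center[1]) ** 2) <= radius ** 2)
    mask = mask[..., None].astype(left.dtype)
    left_mod = np.clip(left + strength * mask, 0.0, 1.0)     # brighter in the left eye
    right_mod = np.clip(right - strength * mask, 0.0, 1.0)   # darker in the right eye
    return left_mod, right_mod

# Toy usage: mid-grey stereo pair with a cue at the image centre.
L = np.full((240, 320, 3), 0.5)
R = np.full((240, 320, 3), 0.5)
L_mod, R_mod = apply_inverse_brightness_cue(L, R, center=(120, 160), radius=30)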