Rendering - Experimental Ideas & Implementations
Browsing Rendering - Experimental Ideas & Implementations by Subject "Antialiasing"
Item: Practical Temporal and Stereoscopic Filtering for Real-time Ray Tracing (The Eurographics Association, 2023)
Philippi, Henrik; Frisvad, Jeppe Revall; Jensen, Henrik Wann; Ritschel, Tobias; Weidlich, Andrea

We present a practical method for temporal and stereoscopic filtering that generates stereo-consistent renderings. Existing methods for stereoscopic rendering often reuse samples from one eye for the other or average between the two eyes. These approaches fail in the presence of ray-tracing effects such as specular reflections and refractions. We derive a new blending strategy that leverages variance to compute per-pixel blending weights for both temporal and stereoscopic rendering. In the temporal domain, our method works well in a low-noise context and is robust in the presence of inconsistent motion vectors, where existing methods such as temporal anti-aliasing (TAA) and deep learning super sampling (DLSS) produce artifacts. In the stereoscopic domain, our method provides a new way to ensure consistency between the left and right eyes. The stereoscopic version of our method can be used with our new temporal method or with existing methods such as DLSS and TAA. In all combinations, it reduces the error and significantly increases the consistency between the eyes, making it practical for real-time settings such as virtual reality (VR).

Item: Real-Time Hybrid Hair Rendering (The Eurographics Association, 2019)
Jansson, Erik Sven Vasconcelos; Chajdas, Matthäus G.; Lacroix, Jason; Ragnemalm, Ingemar; Boubekeur, Tamy and Sen, Pradeep

Rendering hair is a challenging problem for real-time applications. Besides complex shading, the sheer amount of it poses many problems: a human scalp can have over 100,000 strands of hair, and animal fur often surpasses a million. For rendering, both strand-based and volume-based techniques have been used, but usually in isolation. In this work, we present a complete hair rendering solution based on a hybrid approach.
The solution requires no pre-processing, making it a drop-in replacement that combines the best of strand-based and volume-based rendering. Our approach uses the volume not only as a level-of-detail representation that is raymarched directly, but also to simulate global effects, like shadows and ambient occlusion, in real time.

Item: Spatio-Temporal Dithering for Order-Independent Transparency on Ray Tracing Hardware (The Eurographics Association, 2025)
Brüll, Felix; Kern, René; Grosch, Thorsten; Wang, Beibei; Wilkie, Alexander

Efficient rendering of many transparent surfaces is a challenging problem in real-time ray tracing. We introduce an alternative approach to conventional order-independent transparency (OIT) techniques: our method interprets the alpha channel as coverage and uses state-of-the-art temporal anti-aliasing techniques to accumulate transparency over multiple frames. By efficiently utilizing ray tracing hardware and its early ray termination capabilities, our method reduces computational costs compared to conventional OIT methods. Furthermore, our approach shades only one fragment per pixel, significantly lowering the shading workload and improving frame-rate stability. Despite relying on temporal accumulation, our technique performs well in dynamic scenes.
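The first item above describes variance-driven per-pixel blending weights for temporal filtering, without giving its exact formula. As an illustration only, the sketch below uses classic inverse-variance weighting, which is my assumption of the general idea, not the paper's actual weighting; the function name and signature are hypothetical.

```python
import numpy as np

def variance_blend(curr, hist, var_curr, var_hist, eps=1e-6):
    """Blend the current frame with accumulated history per pixel.

    Inverse-variance weighting: the noisier of the two estimates
    contributes less to the blended result (a sketch, not the
    paper's published formula).
    """
    w = var_hist / (var_curr + var_hist + eps)  # weight on the current frame
    return w * curr + (1.0 - w) * hist

# A noisy current sample (variance 3) leans toward the cleaner history:
# w = 1 / (3 + 1) = 0.25, so the blend of curr=1.0, hist=0.0 is 0.25.
out = variance_blend(np.array([1.0]), np.array([0.0]),
                     var_curr=np.array([3.0]), var_hist=np.array([1.0]))
```

Because the weight is computed per pixel, regions with unreliable motion vectors (high current-frame variance) automatically fall back toward history, which is the robustness property the abstract highlights.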
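The hybrid hair paper raymarches a volume both as a level-of-detail representation and for global effects such as shadows and ambient occlusion. A minimal toy sketch of that kind of volume query, assuming a simple voxel density grid with nearest-voxel lookups (all names and parameters here are my own, not from the paper):

```python
import numpy as np

def raymarch_transmittance(density, origin, direction, step=1.0, sigma_t=0.1):
    """March a ray through a voxel density grid, accumulating optical
    depth tau, and return the transmittance exp(-tau).

    A toy stand-in for the kind of volume query used for hair shadows
    or ambient occlusion; real implementations use trilinear filtering
    and adaptive step sizes.
    """
    pos = np.asarray(origin, dtype=float)
    d = np.asarray(direction, dtype=float)
    d = d / np.linalg.norm(d)          # normalize the ray direction
    shape = np.array(density.shape)
    tau = 0.0
    while True:
        idx = np.floor(pos).astype(int)
        if np.any(idx < 0) or np.any(idx >= shape):
            break                      # ray left the grid
        tau += sigma_t * density[tuple(idx)] * step
        pos = pos + d * step
    return np.exp(-tau)

# A ray crossing four unit voxels of density 1.0 accumulates
# tau = 4 * 0.1, giving transmittance exp(-0.4).
grid = np.ones((4, 4, 4))
t = raymarch_transmittance(grid, origin=(0.5, 0.5, 0.5), direction=(1, 0, 0))
```

The same accumulated transmittance can serve both purposes the abstract mentions: directly shading distant hair (level of detail) and attenuating light for shadow/occlusion estimates.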
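The third item interprets alpha as coverage and accumulates transparency over frames so that only one fragment is shaded per pixel per frame. A minimal CPU sketch of that idea, in the spirit of stochastic transparency (my interpretation of the abstract, not the paper's actual shader; surface list, names, and the background constant are hypothetical):

```python
import random

BACKGROUND = 0.0  # hypothetical background radiance

def stochastic_alpha_trace(surfaces, rng):
    """Trace one ray through transparent surfaces, front to back.

    Alpha is treated as coverage: each surface stochastically
    terminates the ray with probability alpha, so exactly one
    fragment is shaded per pixel per frame. Averaging many frames
    converges to the over-composited transparency result.
    """
    for alpha, radiance in surfaces:   # ordered front to back along the ray
        if rng.random() < alpha:
            return radiance            # ray terminates here: shade one fragment
    return BACKGROUND                  # ray passed through everything

# Two surfaces: 50% glass (radiance 1.0) in front of an opaque black wall.
# The over-composited reference value is 0.5 * 1.0 + 0.5 * 0.0 = 0.5.
rng = random.Random(7)
surfaces = [(0.5, 1.0), (1.0, 0.0)]
avg = sum(stochastic_alpha_trace(surfaces, rng) for _ in range(200_000)) / 200_000
```

In this per-frame-single-fragment form, the per-pixel noise is what a temporal anti-aliasing accumulator then filters out, which is how the abstract's "spatio-temporal dithering" framing maps onto existing TAA machinery.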