Efficient Rendering of Participating Media for Multiple Viewpoints
Date: 2023
Publisher: The Eurographics Association
Abstract
Achieving realism in modern games requires the integration of participating media effects, such as fog, dust, and smoke. However, due to the complex nature of scattering and partial occlusions within these media, real-time rendering of high-quality participating media remains a computational challenge. To address this challenge, traditional approaches to real-time participating media rendering involve storing temporary results in a view-aligned grid before ray marching through these cached values. In this paper, we investigate alternative hybrid world- and view-aligned caching methods that allow for the sharing of intermediate computations across cameras in a scene. This approach is particularly relevant for multi-camera setups, such as stereo rendering for VR and AR, local split-screen games, or cloud-based rendering for game streaming, where a large number of players may be in the same location. Our approach relies on a view-aligned grid for near-field computations, which enables us to capture high-frequency shadows in front of a viewer. Additionally, we use a world-space caching structure to selectively activate distant computations based on each viewer's visibility, allowing for the sharing of computations while maintaining high visual quality. The results of our evaluation demonstrate computational savings of up to 50% or more, without compromising visual quality.
CCS Concepts: Computing methodologies -> Rendering; Ray tracing
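The core idea in the abstract, marching a view-aligned grid per viewer for the near field while reusing a shared world-space cache for the far field, can be illustrated with a toy sketch. This is not the paper's implementation: the 1D homogeneous medium, the split distance, the step size, and all function names (`march`, `far_field_cached`, `render_viewer`) are illustrative assumptions.

```python
import math

# Toy parameters (assumptions, not values from the paper).
NEAR_FIELD_END = 10.0   # distance where per-viewer marching hands off to the cache
FAR_FIELD_END = 100.0   # maximum marched distance
STEP = 1.0              # ray-march step size
SIGMA_T = 0.05          # extinction coefficient of a homogeneous toy medium

def inscatter(distance):
    """Toy in-scattered radiance sampled at a point along the ray."""
    return 0.1 * math.exp(-0.01 * distance)

def march(start, end):
    """Ray march accumulating transmittance-weighted in-scattering."""
    radiance = 0.0
    transmittance = math.exp(-SIGMA_T * start)  # attenuation up to the segment start
    d = start
    while d < end:
        radiance += transmittance * inscatter(d) * STEP
        transmittance *= math.exp(-SIGMA_T * STEP)
        d += STEP
    return radiance

# World-space cache: a far-field segment is computed once on first access
# and reused by every viewer whose rays cover it.
far_cache = {}

def far_field_cached():
    key = (NEAR_FIELD_END, FAR_FIELD_END)
    if key not in far_cache:
        far_cache[key] = march(NEAR_FIELD_END, FAR_FIELD_END)
    return far_cache[key]

def render_viewer():
    # Near field: marched per viewer, capturing high-frequency detail
    # (e.g. shadows directly in front of the camera).
    near = march(0.0, NEAR_FIELD_END)
    # Far field: shared, cached result activated on demand.
    return near + far_field_cached()
```

With two co-located viewers, the second `render_viewer()` call reuses the cached far-field segment instead of re-marching it, which is the source of the computational savings the abstract reports for multi-camera setups.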
@inproceedings{10.2312:hpg.20231136,
booktitle = {High-Performance Graphics - Symposium Papers},
editor = {Bikker, Jacco and Gribble, Christiaan},
title = {{Efficient Rendering of Participating Media for Multiple Viewpoints}},
author = {Stojanovic, Robert and Weinrauch, Alexander and Tatzgern, Wolfgang and Kurz, Andreas and Steinberger, Markus},
year = {2023},
publisher = {The Eurographics Association},
ISSN = {2079-8687},
ISBN = {978-3-03868-229-5},
DOI = {10.2312/hpg.20231136}
}