Real-time Reflective and Refractive Novel-view Synthesis
dc.contributor.author | Lochmann, Gerrit | en_US |
dc.contributor.author | Reinert, Bernhard | en_US |
dc.contributor.author | Ritschel, Tobias | en_US |
dc.contributor.author | Müller, Stefan | en_US |
dc.contributor.author | Seidel, Hans-Peter | en_US |
dc.contributor.editor | Jan Bender and Arjan Kuijper and Tatiana von Landesberger and Holger Theisel and Philipp Urban | en_US |
dc.date.accessioned | 2014-12-16T07:25:55Z | |
dc.date.available | 2014-12-16T07:25:55Z | |
dc.date.issued | 2014 | en_US |
dc.description.abstract | We extend novel-view image synthesis from the common diffuse and opaque image formation model to the reflective and refractive case. Our approach uses a ray tree of RGBZ images, where each node contains one RGB light path that is warped differently depending on the depth Z and the type of path. At the core of our approach are two efficient procedures for reflective and refractive warping. Unlike in the diffuse and opaque case, no simple direct solution exists for general geometry. Instead, a per-pixel optimization combined with informed initial guesses warps an HD image with reflections and refractions in 18 ms on a current mobile GPU. The key application is latency avoidance in remote rendering, in particular for head-mounted displays. Other applications are single-pass stereo or multi-view, motion-blur, and depth-of-field rendering, as well as their combinations. | en_US |
dc.description.seriesinformation | Vision, Modeling & Visualization | en_US |
dc.identifier.isbn | 978-3-905674-74-3 | en_US |
dc.identifier.uri | https://doi.org/10.2312/vmv.20141270 | en_US |
dc.publisher | The Eurographics Association | en_US |
dc.title | Real-time Reflective and Refractive Novel-view Synthesis | en_US |