EG 2015 - Short Papers
Browsing EG 2015 - Short Papers by Subject "and texture"
Now showing 1 - 3 of 3
Item
Adaptive LightSlice for Virtual Ray Lights (The Eurographics Association, 2015)
Frederickx, Roald; Bartels, Pieterjan; Dutré, Philip; B. Bickel and T. Ritschel
We speed up the rendering of participating media with Virtual Ray Lights (VRLs) by clustering them in a preprocessing step. A subset of representative VRLs is then sampled from the clustering, which is used for the final rendering. By performing a full variance analysis, we can explicitly estimate the convergence rate of the rendering process and automatically find the locally ideal number of clusters to maximize efficiency. Overall, we report speed-up factors ranging from 13 to 16 compared to unclustered rendering.

Item
High-Quality Shadows for Streaming Terrain Rendering (The Eurographics Association, 2015)
Chajdas, Matthäus G.; Reichl, Florian; Dick, Christian; Westermann, Rüdiger; B. Bickel and T. Ritschel
Rendering of large, detailed 3D terrains on commodity hardware has become possible through the use of raycasting, data caching and prefetching. Adding dynamic shadows as they appear during a day-night cycle remains a challenge, however, because shadow rendering requires access to the entire terrain, invalidating data streaming strategies. In this work we present a novel, practicable shadow rendering approach which distinguishes between near-shadows and precomputed far-shadows to significantly reduce data access and runtime costs. While near-shadows are raytraced using the current cache content, far-shadows are precomputed and stored in a very compact format requiring approximately 3 bits per height-map sample for an entire day-night cycle.

Item
Interactive HDR Environment Map Capturing on Mobile Devices (The Eurographics Association, 2015)
Kán, Peter; B. Bickel and T. Ritschel
Real world illumination, captured by digitizing devices, is beneficial for solving many problems in computer graphics. Therefore, practical methods for capturing this illumination are of high interest. In this paper, we present a novel method for capturing environmental illumination with a mobile device. Our method is highly practical as it requires only a consumer mobile phone, and the result can be instantly used for rendering or material estimation. We capture the real light in high dynamic range (HDR) to preserve its high contrast. Our method utilizes the moving camera of a mobile phone in auto-exposure mode to reconstruct HDR values. The projection of the image to the spherical environment map is based on the orientation of the mobile device. Both HDR reconstruction and projection run on the mobile GPU to enable interactivity. Moreover, an additional image alignment step is performed. Our results show that the presented method faithfully captures the real environment and that rendering with our reconstructed environment maps achieves high quality, comparable to reality.
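
The cluster-and-sample step described in "Adaptive LightSlice for Virtual Ray Lights" can be pictured with a minimal sketch. The snippet below is not the paper's algorithm: it runs a plain k-means over assumed VRL features (segment midpoints and directions) with a fixed cluster count k, whereas the paper derives locally ideal cluster counts from a full variance analysis. The function names (cluster_vrls, sample_representatives) and the feature choice are hypothetical and for illustration only.

import numpy as np

def cluster_vrls(midpoints, directions, powers, k, iters=20, rng=None):
    # Toy k-means over VRL features; each VRL is summarized by its
    # segment midpoint and direction (both assumed (n, 3) arrays).
    rng = np.random.default_rng() if rng is None else rng
    feats = np.hstack([midpoints, directions])           # (n, 6) feature vectors
    centers = feats[rng.choice(len(feats), k, replace=False)]
    for _ in range(iters):
        # Assign every VRL to its nearest cluster center.
        d2 = ((feats[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = d2.argmin(axis=1)
        # Recompute centers; keep the old center if a cluster went empty.
        for c in range(k):
            members = feats[labels == c]
            if len(members):
                centers[c] = members.mean(axis=0)
    return labels

def sample_representatives(labels, powers, rng=None):
    # Pick one representative VRL per cluster, chosen proportionally to its
    # power, and scale it by the cluster's total power so the estimate stays
    # correct in expectation.
    rng = np.random.default_rng() if rng is None else rng
    reps = []
    for c in np.unique(labels):
        idx = np.flatnonzero(labels == c)
        p = powers[idx] / powers[idx].sum()
        chosen = rng.choice(idx, p=p)
        scale = powers[idx].sum() / powers[chosen]
        reps.append((chosen, scale))
    return reps

The representatives returned this way would replace the full VRL set during final rendering, with each representative's contribution multiplied by its scale factor.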
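
The far-shadow format in "High-Quality Shadows for Streaming Terrain Rendering" stores roughly 3 bits per height-map sample for a whole day-night cycle. The abstract does not spell out that encoding, so the sketch below only shows one generic way a few bits per sample can cover a sun cycle: quantizing an assumed per-sample horizon elevation along the sun path to 3 bits and comparing it against the current sun elevation at runtime. Everything here (precompute_far_shadow_bits, far_shadow_lookup, the horizon_elev input) is an assumption for illustration, not the paper's actual scheme.

import numpy as np

def precompute_far_shadow_bits(horizon_elev, bits=3, max_elev=np.pi / 2):
    # horizon_elev holds, per height-map sample, the sun elevation (radians)
    # above which the sample is no longer shadowed by distant terrain; how that
    # angle is obtained (e.g. offline ray marching) is outside this sketch.
    levels = (1 << bits) - 1
    q = np.clip(np.round(horizon_elev / max_elev * levels), 0, levels)
    return q.astype(np.uint8)          # 3 bits of payload per sample

def far_shadow_lookup(quantized, sun_elev, bits=3, max_elev=np.pi / 2):
    # Runtime test: a sample lies in far-shadow while the current sun
    # elevation is below its stored (dequantized) horizon angle.
    levels = (1 << bits) - 1
    threshold = quantized.astype(np.float32) / levels * max_elev
    return sun_elev < threshold        # boolean shadow mask

# Example: a 4x4 height-map tile with made-up horizon angles.
horizon = np.random.uniform(0.0, 0.5, size=(4, 4))
packed = precompute_far_shadow_bits(horizon)
mask = far_shadow_lookup(packed, sun_elev=0.2)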
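
For "Interactive HDR Environment Map Capturing on Mobile Devices", the two steps named in the abstract, HDR reconstruction from auto-exposed frames and orientation-based projection into a spherical environment map, can be sketched on the CPU as below. This is a simplified stand-in, not the paper's GPU implementation: it assumes linearized pixel values with known exposure times, uses a standard hat-weighted merge, and does nearest-neighbor splatting into a latitude-longitude map with an assumed camera-to-world rotation R; the paper's image alignment step is omitted. The helper names (merge_hdr, splat_to_latlong) are hypothetical.

import numpy as np

def merge_hdr(frames, exposures):
    # Weighted merge of linearized LDR frames (values in [0, 1]) taken at
    # different exposure times into one HDR radiance image; the hat weight
    # trusts mid-range pixels most.
    num = np.zeros_like(frames[0], dtype=np.float64)
    den = np.zeros_like(frames[0], dtype=np.float64)
    for img, t in zip(frames, exposures):
        w = 1.0 - np.abs(2.0 * img - 1.0)         # hat weighting
        num += w * img / t                         # per-frame radiance estimate
        den += w
    return num / np.maximum(den, 1e-6)

def splat_to_latlong(hdr, R, fov_x, env_w, env_h):
    # Project each pixel of an HDR camera frame into an equirectangular
    # (latitude-longitude) environment map using the device orientation R
    # (camera-to-world, 3x3). Nearest-neighbor splatting, no blending.
    h, w = hdr.shape[:2]
    f = (w / 2) / np.tan(fov_x / 2)                # focal length in pixels
    xs, ys = np.meshgrid(np.arange(w) - w / 2, np.arange(h) - h / 2)
    dirs = np.stack([xs, -ys, -f * np.ones_like(xs)], axis=-1)
    dirs /= np.linalg.norm(dirs, axis=-1, keepdims=True)
    world = dirs @ R.T                             # rotate view rays into world space
    lon = np.arctan2(world[..., 0], -world[..., 2])        # [-pi, pi]
    lat = np.arcsin(np.clip(world[..., 1], -1, 1))         # [-pi/2, pi/2]
    u = ((lon / (2 * np.pi) + 0.5) * env_w).astype(int) % env_w
    v = ((0.5 - lat / np.pi) * env_h).astype(int).clip(0, env_h - 1)
    env = np.zeros((env_h, env_w) + hdr.shape[2:], dtype=hdr.dtype)
    env[v, u] = hdr                                # last write wins per texel
    return env

Repeating the splat for every merged frame, each with its own orientation, gradually fills the environment map as the phone is swept around, which is the interaction pattern the abstract describes.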