Capturing and Rendering With Incident Light Fields
dc.contributor.author | Unger, J. | en_US |
dc.contributor.author | Wenger, A. | en_US |
dc.contributor.author | Hawkins, T. | en_US |
dc.contributor.author | Gardner, A. | en_US |
dc.contributor.author | Debevec, P. | en_US |
dc.contributor.editor | Philip Dutre and Frank Suykens and Per H. Christensen and Daniel Cohen-Or | en_US |
dc.date.accessioned | 2014-01-27T14:22:46Z | |
dc.date.available | 2014-01-27T14:22:46Z | |
dc.date.issued | 2003 | en_US |
dc.description.abstract | This paper presents a process for capturing spatially and directionally varying illumination from a real-world scene and using this lighting to illuminate computer-generated objects. We use two devices for capturing such illumination. In the first, we photograph an array of mirrored spheres in high dynamic range to capture the spatially varying illumination. In the second, we obtain higher resolution data by capturing images with a high dynamic range omnidirectional camera as it traverses across a plane. For both methods we apply the light field technique to extrapolate the incident illumination to a volume. We render computer-generated objects as illuminated by this captured illumination using a custom shader within an existing global illumination rendering system. To demonstrate our technique we capture several spatially-varying lighting environments with spotlights, shadows, and dappled lighting and use them to illuminate synthetic scenes. We also show comparisons to real objects under the same illumination. | en_US |
dc.description.seriesinformation | Eurographics Workshop on Rendering | en_US |
dc.identifier.isbn | 3-905673-03-7 | en_US |
dc.identifier.issn | 1727-3463 | en_US |
dc.identifier.uri | https://doi.org/10.2312/EGWR/EGWR03/141-149 | en_US |
dc.publisher | The Eurographics Association | en_US |
dc.title | Capturing and Rendering With Incident Light Fields | en_US |
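As a reading aid, the sketch below illustrates the light-field lookup described in the abstract: to estimate the incident radiance at a point in the capture volume from a given direction, the ray is followed back to the capture plane and the radiance seen along that direction is bilinearly blended from the nearest captured omnidirectional radiance maps. The function name, data layout, and plane parameterization are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def sample_incident_light_field(point, direction, capture_maps,
                                plane_z=0.0, grid_origin=(0.0, 0.0),
                                grid_spacing=1.0):
    """Minimal sketch of an incident light field lookup (names and layout assumed).

    capture_maps: dict mapping integer grid coordinates (i, j) on the capture
    plane to callables that return RGB radiance for a unit direction, e.g.
    lookups into HDR light probe images.
    """
    d = np.array(direction, float)
    d /= np.linalg.norm(d)
    p = np.array(point, float)

    # Radiance is assumed constant along the ray (free space), so follow the
    # line through the shading point back to the capture plane z = plane_z.
    if abs(d[2]) < 1e-8:
        return np.zeros(3)  # ray parallel to the plane: no probe to sample
    t = (plane_z - p[2]) / d[2]
    hit = p + t * d

    # Continuous grid coordinates of the intersection on the capture plane.
    u = (hit[0] - grid_origin[0]) / grid_spacing
    v = (hit[1] - grid_origin[1]) / grid_spacing
    i0, j0 = int(np.floor(u)), int(np.floor(v))
    fu, fv = u - i0, v - j0

    # Bilinearly blend the radiance seen along `direction` in the four nearest probes.
    radiance = np.zeros(3)
    weights = [((0, 0), (1 - fu) * (1 - fv)), ((1, 0), fu * (1 - fv)),
               ((0, 1), (1 - fu) * fv),       ((1, 1), fu * fv)]
    for (di, dj), w in weights:
        probe = capture_maps.get((i0 + di, j0 + dj))
        if probe is not None and w > 0.0:
            radiance += w * np.asarray(probe(d))
    return radiance
```

In use, each probe callable could wrap a lookup into one of the captured high dynamic range environment maps (for example, a latitude-longitude image); the sketch ignores occlusion between the capture plane and the shading point, the usual free-space light field assumption.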