Using Time-of-Flight Range Data for Occlusion Handling in Augmented Reality

Date
2007
Publisher
The Eurographics Association
Abstract
One of the main problems of monoscopic video see-through augmented reality (AR) is the lack of reliable depth information. This makes it difficult to correctly represent complex spatial interactions between real and virtual objects, e.g., when rendering shadows. The most obvious graphical artifact is the incorrect display of the occlusion of virtual models by real objects. Since the graphical models are rendered opaquely over the camera image, they always appear to occlude all objects in the real environment, regardless of the actual spatial relationship. In this paper, we propose to utilize a new type of hardware in order to solve some of the basic challenges of AR rendering. We introduce a time-of-flight range sensor into AR, which produces a 2D map of the distances to real objects in the environment. The distance map is registered with high-resolution color images delivered by a digital video camera. When displaying the virtual models in AR, the distance map is used to decide whether the camera image or the virtual object is visible at each position. This way, the occlusion of virtual models by real objects can be correctly represented. Preliminary results obtained with our approach show that useful occlusion handling based on time-of-flight range data is possible.
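The per-pixel visibility decision the abstract describes can be sketched as a depth comparison between the registered distance map and the rendered virtual geometry. The following is a minimal NumPy sketch of that idea, not the authors' implementation; the function name and array layout are assumptions for illustration.

```python
import numpy as np

def composite_with_occlusion(camera_rgb, real_depth, virtual_rgb, virtual_depth):
    """Hypothetical sketch of depth-based occlusion handling:
    show a virtual fragment only where it is closer to the camera
    than the real scene measured by the time-of-flight sensor.

    camera_rgb    -- (H, W, 3) color image from the video camera
    real_depth    -- (H, W) distance map registered to the color image
    virtual_rgb   -- (H, W, 3) rendered virtual content
    virtual_depth -- (H, W) depth of virtual content, np.inf where empty
    """
    # Virtual content wins the depth test where it is nearer than reality.
    virtual_visible = virtual_depth < real_depth  # (H, W) boolean mask
    # Select per pixel: virtual color where visible, camera image elsewhere.
    return np.where(virtual_visible[..., None], virtual_rgb, camera_rgb)
```

For example, a virtual object at 1 m is shown in front of a real wall at 2 m, but is hidden behind a real hand at 0.5 m; a real GPU implementation would instead write the distance map into the depth buffer before rendering the virtual models.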
@inproceedings{10.2312:EGVE/IPT_EGVE2007/109-116,
  booktitle = {Eurographics Symposium on Virtual Environments},
  editor    = {Bernd Froehlich and Roland Blach and Robert van Liere},
  title     = {{Using Time-of-Flight Range Data for Occlusion Handling in Augmented Reality}},
  author    = {Fischer, Jan and Huhle, Benjamin and Schilling, Andreas},
  year      = {2007},
  publisher = {The Eurographics Association},
  ISSN      = {1727-530X},
  ISBN      = {978-3-905674-02-6},
  DOI       = {10.2312/EGVE/IPT_EGVE2007/109-116}
}