ICAT-EGVE2017
Item: Real-time Ambient Fusion of Commodity Tracking Systems for Virtual Reality (The Eurographics Association, 2017)
Authors: Fountain, Jake; Smith, Shamus P.
Editors: Robert W. Lindeman, Gerd Bruder, Daisuke Iwai

Abstract: Cross-compatibility of virtual reality devices is limited by the difficulty of aligning and fusing data between systems. In this paper, a plugin for ambiently aligning the reference frames of virtual reality tracking systems is presented. The core contribution is a procedure for ambient calibration, which defines ambient behaviors for data gathering, system calibration, and fault detection. Data is collected ambiently from in-application, self-directed movements, and calibration is performed automatically between dependent sensor systems. Sensor fusion is then carried out by taking, for each body part, the most accurate data among all systems. The procedure was applied to aligning a Kinect v2 with an HTC Vive and an Oculus Rift in a variety of common virtual reality scenarios, and the results were compared to alignment performed with a gold-standard OptiTrack motion capture system. Typical results were 20 cm and 4° of error relative to the ground truth, which compares favorably with the accepted accuracy of the Kinect v2. Data collection for full calibration took on average 13 seconds of in-application, self-directed movement. This work represents an essential step towards plug-and-play sensor fusion for virtual reality technology.
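The core of aligning two tracking systems' reference frames is estimating the rigid transform that maps one system's coordinates onto the other's, given corresponding position samples of the same tracked body part. The paper's ambient procedure gathers those correspondences passively and adds fault detection; the snippet below is only a minimal sketch of the underlying alignment step, using the standard SVD-based (Kabsch) method. The function name `align_frames` and the use of raw point correspondences are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def align_frames(src, dst):
    """Estimate the rigid transform (R, t) with dst ≈ R @ src + t from
    N corresponding 3-D points (shape (N, 3) each), via the SVD-based
    Kabsch method. Illustrative sketch only; the paper's ambient
    calibration procedure is more involved (data gathering, fault
    detection, per-body-part fusion)."""
    src_c = src.mean(axis=0)                 # centroids of each point cloud
    dst_c = dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)      # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T  # optimal rotation
    t = dst_c - R @ src_c                    # translation from centroids
    return R, t
```

In a setting like the paper's, `src` might hold hand positions reported by the Kinect v2 and `dst` the corresponding controller positions from the Vive, collected during normal in-application movement; once `R` and `t` are estimated, the Kinect's skeletal data can be expressed in the Vive's reference frame for fusion.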