Gaze-driven Object Tracking for Real Time Rendering
Date
2013
Journal Title
Computer Graphics Forum
Journal ISSN
1467-8659
Volume Title
Publisher
The Eurographics Association and Blackwell Publishing Ltd.
Abstract
To efficiently deploy eye-tracking within 3D graphics applications, we present a new probabilistic method that predicts the patterns of a user's eye fixations in animated 3D scenes from noisy eye-tracker data. The proposed method utilises both the eye-tracker data and the known information about the 3D scene to improve accuracy, robustness and stability. Eye-tracking can thus be used, for example, to induce focal cues via gaze-contingent depth-of-field rendering, add intuitive controls to a video game, and create a highly reliable scene-aware saliency model. The computed probabilities rely on the consistency of the gaze scan-paths with the position and velocity of a moving or stationary target. The temporal characteristic of eye fixations is imposed by a Hidden Markov model, which steers the solution towards the most probable fixation patterns. The derivation of the algorithm is driven by the data from two eye-tracking experiments: the first experiment provides actual eye-tracker readings and the position of the target to be tracked; the second experiment is used to derive a JND-scaled (Just Noticeable Difference) quality metric that quantifies the perceived loss of quality due to the errors of the tracking algorithm. Data from both experiments are used to justify design choices, and to calibrate and validate the tracking algorithms. This novel method outperforms commonly used fixation algorithms and is able to track objects smaller than the nominal error of an eye-tracker.
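To illustrate the idea summarised in the abstract, the sketch below assigns noisy gaze samples to scene objects with an HMM-style forward update: emission probabilities measure how consistent the gaze position and velocity are with each (possibly moving) object, and a "sticky" transition prior imposes the temporal stability of fixations. This is a minimal illustration only; the Gaussian consistency terms, the uniform transition matrix, and all parameter names and values are assumptions for the example, not the calibrated model described in the paper.

```python
# Hedged sketch: HMM forward pass over candidate objects, driven by the
# position/velocity consistency of noisy gaze data. Parameters are illustrative.
import numpy as np


def gaze_object_posteriors(gaze_pos, gaze_vel, obj_pos, obj_vel,
                           sigma_pos=2.0, sigma_vel=4.0, stay_prob=0.95):
    """Posterior probability, per frame, of each object being tracked by gaze.

    gaze_pos, gaze_vel : (T, 2) arrays of gaze position/velocity samples
    obj_pos, obj_vel   : (T, N, 2) arrays of object screen positions/velocities
    Returns a (T, N) array of posterior probabilities.
    """
    T, N = obj_pos.shape[0], obj_pos.shape[1]

    # Emission: Gaussian consistency of gaze with object position and velocity.
    dp = np.linalg.norm(gaze_pos[:, None, :] - obj_pos, axis=-1)
    dv = np.linalg.norm(gaze_vel[:, None, :] - obj_vel, axis=-1)
    emission = (np.exp(-0.5 * (dp / sigma_pos) ** 2) *
                np.exp(-0.5 * (dv / sigma_vel) ** 2) + 1e-12)

    # Transition: favour staying on the same object to keep fixations stable.
    trans = np.full((N, N), (1.0 - stay_prob) / max(N - 1, 1))
    np.fill_diagonal(trans, stay_prob)

    post = np.zeros((T, N))
    belief = np.full(N, 1.0 / N)       # uniform prior over objects
    for t in range(T):
        belief = (belief @ trans) * emission[t]   # predict, then correct
        belief /= belief.sum()
        post[t] = belief
    return post
```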
Description
@article{10.1111:cgf.12036,
journal = {Computer Graphics Forum},
title = {{Gaze-driven Object Tracking for Real Time Rendering}},
author = {Mantiuk, Radoslaw and Bazyluk, Bartosz and Mantiuk, Rafal K.},
year = {2013},
publisher = {The Eurographics Association and Blackwell Publishing Ltd.},
ISSN = {1467-8659},
DOI = {10.1111/cgf.12036}
}