Computers & Graphics

SalientGaze: Saliency-based gaze correction in virtual reality

Peiteng Shi, Markus Billeter, and Elmar Eisemann

Eye-tracking with gaze estimation is a key element in many applications, ranging from foveated rendering and user interaction to behavioural analysis and usage metrics. For virtual reality, eye-tracking typically relies on near-eye cameras mounted in the VR headset. Such methods usually involve an initial calibration to create a mapping from eye features to a gaze position. However, the accuracy of this initial calibration degrades when the position of the headset relative to the user's head changes; this is especially noticeable when users readjust the headset for comfort or remove it entirely for a short while. We show that such shifts can be corrected via 2D drift vectors in eye space. Our method estimates these drifts by extracting salient cues from the displayed virtual environment to determine potential gaze directions. Our solution compensates for HMD shifts, even those caused by taking off the headset, which allows us to eliminate reinitialization steps.
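The core idea can be illustrated with a minimal sketch. This is not the paper's implementation; it only shows the principle under a simplifying assumption: if the user tends to fixate salient points, then the average residual between reported gaze positions and their nearest salient points approximates a constant 2D drift, which can then be subtracted from subsequent gaze samples. The function names (`estimate_drift`, `correct`) and the toy coordinates are illustrative only.

```python
# Illustrative sketch (NOT the paper's method): estimate a constant 2D gaze
# drift in screen space from saliency cues, assuming fixations land on
# salient points.

def estimate_drift(gaze_samples, salient_points):
    """Average offset from each reported gaze sample to its nearest salient point."""
    dx = dy = 0.0
    for gx, gy in gaze_samples:
        # Assume the user was actually looking at the closest salient cue.
        sx, sy = min(salient_points,
                     key=lambda p: (p[0] - gx) ** 2 + (p[1] - gy) ** 2)
        dx += sx - gx
        dy += sy - gy
    n = len(gaze_samples)
    return (dx / n, dy / n)

def correct(gaze, drift):
    """Apply the estimated drift vector to a raw gaze sample."""
    return (gaze[0] + drift[0], gaze[1] + drift[1])

# Toy example: two salient targets; the headset shifted, so gaze is
# reported with a constant offset of (-0.05, -0.05).
salient = [(0.2, 0.3), (0.7, 0.6)]
shifted_gaze = [(0.15, 0.25), (0.65, 0.55)]
drift = estimate_drift(shifted_gaze, salient)
corrected = correct((0.15, 0.25), drift)
```

In practice a robust estimator (e.g. outlier rejection over many samples) would replace the plain average, since not every fixation lands on a salient point.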


Citation

Peiteng Shi, Markus Billeter, and Elmar Eisemann, SalientGaze: Saliency-based gaze correction in virtual reality, Computers & Graphics, 91, pp. 83–94, 2020.

BibTeX

@article{bib:shi:2020,
    author       = { Shi, Peiteng and Billeter, Markus and Eisemann, Elmar },
    title        = { SalientGaze: Saliency-based gaze correction in virtual reality },
    journal      = { Computers \& Graphics },
    volume       = { 91 },
    year         = { 2020 },
    pages        = { 83--94 },
    note         = { https://doi.org/10.1016/j.cag.2020.06.007 },
    doi          = { 10.1016/j.cag.2020.06.007 },
    dblp         = { journals/cg/ShiBE20 },
    url          = { https://publications.graphics.tudelft.nl/papers/166 },
}