Title
Estimating Gaze from Head and Hand Pose and Scene Images for Open-Ended Exploration in VR Environments
Abstract
The widespread utility of eye-tracking technology has created a growing demand for more consistent and reliable eye-tracking systems, and there is a need for new and accessible approaches that can enhance the accuracy of eye-tracking data. Previous studies have offered evidence for associations between certain non-eye signals and gaze, such as strong coordination between head motion and gaze shifts, e.g. [3]; hand and eye spatiotemporal statistics, e.g. [7]; and gaze behavior and scene content, e.g. [2]. Previous studies have also shown how various combinations of eye, head, scene, and hand signals can be leveraged for applications such as gaze estimation [5], [10], prediction [8], and classification [6]. Though these previous approaches provide support for the idea that non-eye sensors (i.e., head, hand, and scene) are useful for estimating gaze, they have not yet fully addressed how these signals contribute to gaze estimation, individually and in combination.
Year
2021
DOI
10.1109/VRW52623.2021.00159
Venue
2021 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW 2021)
DocType
Conference
Citations
0
PageRank
0.34
References
0
Authors
5
Name               Order  Citations  PageRank
Kara J. Emery      1      0          0.68
Marina Zannoli     2      11         2.58
Lei Xiao           3      0          0.68
James Warren       4      0          0.68
Sachin S. Talathi  5      0          0.68