Abstract |
---|
In this paper we describe a new method for determining gaze depth with a head-mounted eye-tracker. Eye-trackers are being incorporated into head-mounted displays (HMDs), and eye gaze is being used for interaction in Virtual and Augmented Reality. Some interaction methods require accurate measurement of the x- and y-directions of the eye gaze and, in particular, of the focal depth. Eye-tracking technology generally achieves high accuracy in the x- and y-directions, but not in depth. We used a binocular gaze tracker with two eye cameras, and the gaze vectors were fed into an MLP neural network for training and estimation. For the performance evaluation, data was collected from 13 people gazing at fixed points at distances from 1 m to 5 m. Classifying gaze into these fixed distances produced an average classification error of nearly 10% and an average distance error of 0.42 m. This is sufficient for some Augmented Reality applications, but more research is needed to estimate a user's gaze as it moves in continuous space. |
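The abstract's pipeline — binocular gaze vectors fed to an MLP that classifies fixation depth into the five distances from 1 m to 5 m — can be sketched as below. The paper does not publish its network, calibration, or data, so everything here (synthetic vergence data, the interpupillary distance, noise level, network size, and learning rate) is an illustrative assumption, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

IPD = 0.065  # interpupillary distance in metres (assumed value)
depths = np.array([1.0, 2.0, 3.0, 4.0, 5.0])  # fixation distances from the paper

def make_samples(n_per_class, noise=0.0005):
    """Synthesise noisy left/right unit gaze vectors for each fixation depth."""
    X, y = [], []
    for label, d in enumerate(depths):
        for _ in range(n_per_class):
            target = np.array([0.0, 0.0, d])          # fixation point on midline
            left = target - np.array([-IPD / 2, 0.0, 0.0])   # left-eye gaze ray
            right = target - np.array([IPD / 2, 0.0, 0.0])   # right-eye gaze ray
            v = np.concatenate([left / np.linalg.norm(left),
                                right / np.linalg.norm(right)])
            X.append(v + rng.normal(0.0, noise, 6))   # simulated tracker noise
            y.append(label)
    return np.array(X), np.array(y)

X, y = make_samples(200)
X = (X - X.mean(axis=0)) / (X.std(axis=0) + 1e-9)    # standardise features

# One-hidden-layer MLP trained with full-batch gradient descent on
# softmax cross-entropy (sizes and learning rate are assumptions).
W1 = rng.normal(0, 0.5, (6, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, (16, 5)); b2 = np.zeros(5)
onehot = np.eye(5)[y]

for _ in range(1000):
    h = np.tanh(X @ W1 + b1)
    logits = h @ W2 + b2
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    g = (p - onehot) / len(X)                 # d(loss)/d(logits)
    gW2 = h.T @ g; gb2 = g.sum(0)
    gh = g @ W2.T * (1 - h ** 2)              # backprop through tanh
    gW1 = X.T @ gh; gb1 = gh.sum(0)
    for P, G in ((W1, gW1), (b1, gb1), (W2, gW2), (b2, gb2)):
        P -= 0.5 * G

pred = np.argmax(np.tanh(X @ W1 + b1) @ W2 + b2, axis=1)
accuracy = (pred == y).mean()
mean_depth_error = np.abs(depths[pred] - depths[y]).mean()  # in metres
```

Note how the discriminative signal is the vergence between the two gaze rays, which shrinks roughly as 1/d² with distance; this is why depth accuracy degrades at the far end of the 1 m to 5 m range, consistent with the abstract's observation that depth is much harder to recover than the x- and y-directions.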
Year | DOI | Venue |
---|---|---|
2017 | 10.1109/ISUVR.2017.13 | 2017 International Symposium on Ubiquitous Virtual Reality (ISUVR) |
Keywords | Field | DocType |
Eye-gaze,3D gaze,Machine Learning,Augmented Reality,Head-mounted display | Depth of focus,Computer vision,Gaze,Computer science,Augmented reality,Eye tracking,Multilayer perceptron,Artificial intelligence,Fixed point,Artificial neural network,Calibration | Conference |
ISBN | Citations | PageRank |
978-1-5386-3092-1 | 0 | 0.34 |
References | Authors |
---|---|
9 | 9 |
Name | Order | Citations | PageRank |
---|---|---|---|
Youngho Lee | 1 | 123 | 17.72 |
Choonsung Shin | 2 | 225 | 14.64 |
Alexander Plopski | 3 | 60 | 17.15 |
Yuta Itoh | 4 | 222 | 25.69 |
Thammathip Piumsomboon | 5 | 201 | 18.15 |
Arindam Dey | 6 | 205 | 23.43 |
Gun Lee | 7 | 543 | 56.29 |
Seung-Won Kim | 8 | 132 | 18.51 |
Mark Billinghurst | 9 | 5357 | 542.78 |