Abstract
---
Gaze interaction holds great promise for seamless human-computer interaction. However, current wearable mobile eye trackers require user augmentation that negatively impacts natural user behavior, while remote trackers require users to position themselves within a confined tracking range. We present GazeDrone, the first system that combines a camera-equipped aerial drone with a computational method to detect sidelong glances for spontaneous (calibration-free) gaze-based interaction with surrounding pervasive systems (e.g., public displays). GazeDrone does not require augmenting each user with on-body sensors and allows interaction from arbitrary positions, even while moving. We demonstrate that drone-supported gaze interaction is feasible and accurate for certain movement types. It is well-perceived by users, in particular while interacting from a fixed position as well as while moving orthogonally or diagonally to a display. We present design implications and discuss opportunities and challenges for drone-supported gaze interaction in public.
Year | DOI | Venue
---|---|---
2018 | 10.1145/3213526.3213539 | MobiSys '18: The 16th Annual International Conference on Mobile Systems, Applications, and Services, Munich, Germany, June 2018

DocType | ISBN | Citations
---|---|---
Conference | 978-1-4503-5839-2 | 0

PageRank | References | Authors
---|---|---
0.34 | 19 | 4
Name | Order | Citations | PageRank |
---|---|---|---
Mohamed Khamis | 1 | 218 | 36.51 |
Anna Kienle | 2 | 0 | 0.34 |
Florian Alt | 3 | 1552 | 119.24 |
Andreas Bulling | 4 | 2279 | 133.41 |