Title
Gaze-directed hands-free interface for mobile interaction
Abstract
While mobile devices let people carry out computing and communication tasks almost anywhere, they generally lack support for task execution while the user is in motion. This is because the interaction schemes of most mobile applications center on the device's visual display, and when the user is moving (with important body parts, such as the head and hands, also in motion), it is difficult to read the visual output on the small hand-held display and respond with timely, accurate input. In this paper, we propose an interface that allows the user to interact with a mobile device while in motion, without having to look at it or use one's hands. More specifically, the user interacts, through gaze and head-motion gestures, with an invisible virtual interface panel, aided by a head-worn gyro sensor and aural feedback. Since the menu is one of the most prevalent methods of interaction, we investigate and focus on various forms of menu presentation, such as the layout and the number of comfortably selectable menu items. With head motion, a 4×2 or 3×3 grid menu turns out to be the most effective. The results of this study can be further extended toward a more sophisticated non-visual mobile interface.
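The abstract describes selecting items from an invisible virtual grid menu by pointing the head, sensed by a head-worn gyro. A minimal sketch of that idea is shown below, assuming a simple linear mapping from yaw/pitch angles to grid cells; the function name, angle ranges, and grid extents are illustrative assumptions, not details from the paper.

```python
# Hypothetical sketch: map head orientation (from a head-worn gyro) to a
# cell of an invisible grid menu, e.g. the 3x3 layout mentioned in the
# abstract. All angle ranges below are assumed values for illustration.

def select_cell(yaw_deg, pitch_deg, rows=3, cols=3,
                yaw_range=(-30.0, 30.0), pitch_range=(-20.0, 20.0)):
    """Map head yaw/pitch (degrees) to a (row, col) cell of the grid.

    Returns None when the head points outside the panel's angular extent.
    """
    def to_index(angle, lo, hi, n):
        if angle < lo or angle > hi:
            return None
        # Normalize to [0, 1], then bucket into n equal slices.
        frac = (angle - lo) / (hi - lo)
        return min(int(frac * n), n - 1)

    col = to_index(yaw_deg, *yaw_range, cols)
    row = to_index(pitch_deg, *pitch_range, rows)
    if col is None or row is None:
        return None  # head is pointing off the virtual panel
    return (row, col)
```

In a real eyes-free system, each cell change would trigger aural feedback so the user can confirm the highlighted item without looking at the device.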
Year
2011
DOI
10.1007/978-3-642-21605-3_34
Venue
HCI (2)
Keywords
menu presentation, gaze-directed hands-free interface, invisible virtual interface panel, grid menu, head motion, mobile device, user interacts, mobile application, mobile interface, head motion gesture, mobile interaction, selectable menu item
Field
Gaze, Gesture, Computer science, Human–computer interaction, Mobile device, Mobile interfaces, Mobile interaction, Multimedia, Natural user interface, Grid
DocType
Conference
Volume
6762
ISSN
0302-9743
Citations
1
PageRank
0.35
References
12
Authors
3
Name | Order | Citations | PageRank
Gie-seo Park | 1 | 6 | 0.89
Jong-gil Ahn | 2 | 5 | 4.21
Gerard J. Kim | 3 | 2363 | 8.45