Abstract |
---|
As wearable devices move toward the face (e.g., smart earbuds and glasses), there is an increasing need to support intuitive interactions with these devices. Current sensing techniques can already detect many mouth-based gestures; however, users' preferences for these gestures are not fully understood. In this paper, we investigate the design space and usability of mouth-based microgestures. We first conducted brainstorming sessions (N=16) and compiled an extensive set of 86 user-defined gestures. Then, with an online survey (N=50), we assessed the physical and mental demand of our gesture set and identified a subset of 14 gestures that can be performed easily and naturally. Finally, we conducted a remote Wizard-of-Oz usability study (N=11) mapping gestures to various daily smartphone operations in sitting and walking contexts. From these studies, we develop a taxonomy for mouth gestures, finalize a practical gesture set for common applications, and provide design guidelines for future mouth-based gesture interactions. |
Year | DOI | Venue
---|---|---
2021 | 10.1145/3461778.3462004 | Proceedings of the 2021 ACM Designing Interactive Systems Conference (DIS 2021)

Keywords | DocType | Citations
---|---|---
Mouth microgesture, interaction techniques, user-designed gestures, design space | Conference | 0

PageRank | References | Authors
---|---|---
0.34 | 0 | 6
Name | Order | Citations | PageRank
---|---|---|---
Victor Chen | 1 | 0 | 0.34 |
Xuhai Xu | 2 | 10 | 2.81 |
Richard Y. M. Li | 3 | 36 | 9.97 |
Yuanchun Shi | 4 | 1005 | 135.34 |
Shwetak N. Patel | 5 | 2967 | 211.74 |
Yuntao Wang | 6 | 105 | 23.69 |