| Title |
|---|
| Evaluating multimodal interaction with gestures and speech for point and select tasks |
| Abstract |
|---|
| Natural interactions such as speech and gestures have achieved mainstream success independently, with consumer products such as Leap Motion popularizing gestures, while mobile phones have embraced speech input. In this paper we designed an interaction style that combines both gestures and speech to evaluate point and select interaction. Our results indicate that while gestures are slower than the mouse, the introduction of speech allows selection to be performed without negatively impacting navigation. We also found that users adapt to this interaction quickly and improve their performance with minimal training. This lays the foundation for future work, such as mouse replacement technologies for those with hand impairments. |
| Year | DOI | Venue |
|---|---|---|
| 2014 | 10.1145/2639189.2670267 | NordiCHI |
| Keywords | Field | DocType |
|---|---|---|
| gestural interaction, multimodal interaction, speech input, user interfaces | Multimodal interaction, Computer science, Gesture, Leap motion, Speech recognition, Human–computer interaction, Multimedia | Conference |
| Citations | PageRank | References |
|---|---|---|
| 1 | 0.36 | 6 |
| Authors |
|---|
| 3 |
| Name | Order | Citations | PageRank |
|---|---|---|---|
| Alvin Jude | 1 | 9 | 1.29 |
| G. Michael Poor | 2 | 58 | 10.00 |
| Darren Guinness | 3 | 42 | 6.90 |