| Abstract |
|---|
| This paper describes the results of two studies conducted to determine the role of visual cues for a collaborative navigation task in a mixed-space environment. Both studies required a user with an exocentric view of a virtual room to navigate a fully immersed user with an egocentric view to an exit. The first study compares natural hand-based gestures, a mouse-based interface, and an audio-only technique to determine their relative efficiency in terms of task completion times. The follow-up study compares natural hand-based gestures against a mouse-based interface in a scenario in which participants are unable to communicate verbally. The results show that visual cue-based collaborative navigation techniques are significantly more efficient than an audio-only technique. The results also show that natural hand gestures are more expressive and lead to quicker completion times in situations where verbal communication is not possible. |
| Year | Venue | Keywords |
|---|---|---|
| 2009 | AUIC | mixed-space collaborative navigation, audio-only technique, task completion time, collaborative navigation task, mouse-based interface, natural hand gesture, follow-up study, natural hand-based gesture, exocentric view, quicker completion time, egocentric view |
| DocType | Citations | PageRank |
|---|---|---|
| Conference | 1 | 0.35 |
| References | Authors |
|---|---|
| 14 | 3 |
| Name | Order | Citations | PageRank |
|---|---|---|---|
| Aaron Stafford | 1 | 38 | 3.42 |
| Bruce H. Thomas | 2 | 1723 | 201.93 |
| Wayne Piekarski | 3 | 821 | 97.80 |