Abstract |
---|
This paper presents a study of the efficiency of visual cues for a collaborative navigation task in a mixed-space environment. The task required a user with an exocentric view of a virtual room to guide a fully immersed user, who had an egocentric view, to an exit. The study compares natural hand-based gestures, a mouse-based interface, and an audio-only technique in terms of task completion time. The results show that the visual-cue-based collaborative navigation techniques are significantly more efficient than the audio-only technique. |
Year | DOI | Venue |
---|---|---|
2008 | 10.1109/ISMAR.2008.4637356 | Cambridge |
Keywords | Field | DocType
---|---|---|
mixed-space collaborative navigation, audio-only technique, relative efficiency, task completion time, mouse-based interface, collaborative navigation task, visual cue-based collaborative navigation, visual cue, mixed-space environment, exocentric view, egocentric view, groupware, collaboration, augmented reality, user interfaces, visual cues, interaction technique, visualization, computer graphics, virtual reality, navigation | Gesture, Computer science, Augmented reality, Human–computer interaction, Artificial intelligence, Task completion, Sensory cue, Computer vision, Visualization, Collaborative software, Endocentric and exocentric, User interface, Multimedia | Conference
ISSN | ISBN | Citations
---|---|---|
1554-7868 | 978-1-4244-2859-5 | 2
PageRank | References | Authors
---|---|---|
0.44 | 3 | 3
Name | Order | Citations | PageRank |
---|---|---|---|
Aaron Stafford | 1 | 38 | 3.42 |
Bruce H. Thomas | 2 | 1723 | 201.93 |
Wayne Piekarski | 3 | 821 | 97.80 |