Title
Using multisensory cues for direction information in teleoperation: More is not always better.
Abstract
When full automation of mobile robots is not possible or desirable, teleoperation constitutes an alternative. The human operator can be supported with direction cues that facilitate localization and navigation. These cues are typically presented in the auditory, haptic, and/or visual modality. An experiment was conducted to systematically and empirically evaluate the uni-modal and multi-modal effects of auditory and haptic feedback, compared to visual feedback, on target localization accuracy. Results show that haptic as well as auditory direction cues lead to significantly lower accuracy than visual cues. Moreover, combining feedback cues does not necessarily improve performance and can even reduce accuracy. Based on these results, implications for multi-modal human–machine interface design are discussed.
Year
2017
DOI
10.1109/ICRA.2017.7989773
Venue
ICRA
Field
Teleoperation, Sensory cue, Computer vision, Visualization, Computer science, Automation, Human–machine interface, Artificial intelligence, Robot, Mobile robot, Haptic technology
DocType
Conference
Volume
2017
Issue
1
Citations
0
PageRank
0.34
References
20
Authors
2
Name, Order, Citations, PageRank
T. M. Benz, 1, 0, 0.34
Verena Nitsch, 2, 8, 5.93