Abstract
---
This paper presents a study in which users define intuitive gestures for navigating a humanoid robot. For eleven navigational commands, 385 gestures performed by 35 participants were analyzed. The results of the study reveal user-defined gesture sets for both novice and expert users. In addition, we present a taxonomy of the user-defined gestures, agreement scores for the gesture sets, time performances of the gesture motions, and implications for the design of robot control, with a focus on recognition and user interfaces.
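The abstract reports agreement scores for the elicited gesture sets. The paper does not restate the formula here, but a common formulation for gesture-elicitation studies (attributed to Wobbrock et al.) sums, over each group of identical proposals for a command, the squared fraction of participants in that group. A minimal sketch, with hypothetical gesture labels:

```python
from collections import Counter

def agreement_score(proposals):
    """Agreement score for one referent (navigational command).

    proposals: list of gesture labels, one per participant.
    Returns sum over identical-proposal groups of (group size / total)^2,
    ranging from 1/len(proposals) (no agreement) to 1.0 (full agreement).
    """
    total = len(proposals)
    counts = Counter(proposals)
    return sum((c / total) ** 2 for c in counts.values())

# Hypothetical example: 35 participants proposing gestures for one command.
proposals = ["point_forward"] * 20 + ["lean_forward"] * 10 + ["wave"] * 5
print(round(agreement_score(proposals), 3))  # → 0.429
```

The score rewards consensus: a command for which most participants propose the same gesture scores near 1.0, which is one signal used when assembling a user-defined gesture set.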
Year | DOI | Venue |
---|---|---|
2012 | 10.1007/978-3-642-34103-8_37 | ICSR |
Keywords | Field | DocType
---|---|---
eleven navigational command, gesture set, robot control, user-defined gesture set, gesture motion, agreement score, present implication, navigational control, user-defined body gesture, humanoid robot, intuitive gesture, expert user | Robot control, Computer vision, Computer science, Gesture, Gesture recognition, Artificial intelligence, User interface, Humanoid robot | Conference
Citations | PageRank | References
---|---|---
13 | 0.56 | 10
Authors
---
5
Name | Order | Citations | PageRank |
---|---|---|---|
Mohammad Obaid | 1 | 177 | 28.40 |
Markus Häring | 2 | 81 | 6.43 |
Felix Kistler | 3 | 156 | 10.52 |
René Bühling | 4 | 48 | 6.90 |
Elisabeth André | 5 | 3634 | 433.65 |