Abstract |
---|
This paper presents a framework that allows users to interact with and navigate a humanoid robot using body gestures. The first part of the paper describes a study to define intuitive gestures for eleven navigational commands, based on an analysis of 385 gestures performed by 35 participants. From the study results, we present a taxonomy of the user-defined gesture sets, agreement scores for the gesture sets, and time performances of the gesture motions. The second part of the paper presents a full-body interaction system for recognizing the user-defined gestures. We evaluate the system by recruiting 22 participants to test the accuracy of the proposed system. The results show that most of the defined gestures can be successfully recognized, with a precision between 86 % and 100 % and an accuracy between 73 % and 96 %. We discuss the limitations of the system and outline future improvements. |
Year | DOI | Venue |
---|---|---|
2014 | 10.1007/s12369-014-0233-3 | I. J. Social Robotics |
Keywords | Field | DocType
---|---|---|
Humanoid robot, Robot, Nao, Gesture, User-defined, User-defined gestures, Robot navigation, Gesture recognition | Computer vision, Simulation, Gesture, Gesture recognition, Psychology, Artificial intelligence, Robot, Humanoid robot | Journal
Volume | Issue | ISSN
---|---|---|
6 | 3 | 1875-4791
Citations | PageRank | References
---|---|---|
3 | 0.37 | 34
Authors |
---|
5 |
Name | Order | Citations | PageRank |
---|---|---|---|
Mohammad Obaid | 1 | 177 | 28.40 |
Felix Kistler | 2 | 156 | 10.52 |
Markus Häring | 3 | 81 | 6.43 |
René Bühling | 4 | 48 | 6.90 |
Elisabeth André | 5 | 3634 | 433.65 |