Abstract |
---|
In recent years, attempts have been made to make robot control more intuitive and intelligible by exploiting and integrating anthropomorphic features to boost social human-robot interaction. The design and construction of anthropomorphic robots for this kind of interaction is not the only challenging issue -- smooth and expectation-matching motion control remains an unsolved topic. In this work we present a highly configurable, portable, and open control framework that facilitates anthropomorphic motion generation for humanoid robot heads by enhancing state-of-the-art neck-eye coordination with human-like eyelid saccades and animation. On top of that, the presented framework supports dynamic neck offset angles that allow animation overlays and changes in alignment to the robot's communication partner while retaining visual focus on a given target. To demonstrate the universal applicability of the proposed ideas, we used this framework to control the Flobi and the iCub robot heads, both in simulation and on the physical robots. To foster further comparative studies of different robot heads, we will release all software based on this contribution under an open-source license. |
Year | DOI | Venue |
---|---|---|
2016 | 10.1145/2974804.2974827 | HAI |
Field | DocType | Citations
---|---|---|
Computer vision, Social robot, Robot control, iCub, Computer science, Simulation, Personal robot, Animation, Artificial intelligence, Robot, Mobile robot, Humanoid robot | Conference | 0
PageRank | References | Authors
---|---|---|
0.34 | 4 | 4
Name | Order | Citations | PageRank |
---|---|---|---|
Simon Schulz | 1 | 11 | 1.48 |
Florian Lier | 2 | 24 | 7.48 |
Andreas Kipp | 3 | 1 | 1.38 |
Sven Wachsmuth | 4 | 267 | 43.83 |