Abstract |
---|
In this work, we are interested in understanding how emotional interactions with a social partner can bootstrap increasingly complex behaviors such as social referencing. Our idea is that social referencing, facial expression recognition, and joint attention can emerge from a simple sensori-motor architecture. We show that, without knowing that the other is an agent, our robot is able to learn complex tasks if the human partner displays a low-level emotional resonance with the robot head. Hence, we advocate the idea that social referencing can be bootstrapped from a simple sensori-motor system not dedicated to social interactions. |
Year | DOI | Venue |
---|---|---|
2011 | 10.1109/DEVLRN.2011.6037317 | 2011 IEEE International Conference on Development and Learning (ICDL) |
Keywords | Field | DocType |
---|---|---|
emotion recognition, human-robot interaction, learning (artificial intelligence), robot vision, emotional interaction, emotional resonance, facial expression recognition, joint attention, robot head, robot learning, sensori-motor architecture, social partner, social referencing, task learning, robots, face | Robot learning, Social robot, Architecture, Joint attention, Facial expression recognition, Computer science, Bootstrapping, Artificial intelligence, Robot, Human–robot interaction | Conference |
Volume | ISSN | ISBN |
---|---|---|
2 | 2161-9476 | 978-1-61284-989-8 |
Citations | PageRank | References |
---|---|---|
7 | 0.68 | 6 |
Authors |
---|
3 |
Name | Order | Citations | PageRank |
---|---|---|---|
Sofiane Boucenna | 1 | 138 | 11.16 |
Philippe Gaussier | 2 | 414 | 58.53 |
Laurence Hafemeister | 3 | 45 | 4.41 |