Title
When my robot smiles at me: Enabling human-robot rapport via real-time head gesture mimicry
Abstract
People use imitation to encourage each other during conversation. We conducted an experiment to investigate how imitation by a robot affects people's perceptions of their conversation with it. The robot operated in one of three modes: full head gesture mimicking, partial head gesture mimicking (nodding), and non-mimicking (blinking). Participants rated how satisfied they were with the interaction. We hypothesized that participants in the full head gesture condition would rate their interaction the most positively, followed by the partial and non-mimicking conditions. We also performed gesture analysis to see whether any differences existed between groups, and found that men made significantly more gestures than women while interacting with the robot. Finally, we interviewed participants to gain additional insight into their feelings of rapport with the robot, which revealed a number of valuable findings.
Year: 2010
DOI: 10.1007/s12193-009-0028-2
Venue: Journal on Multimodal User Interfaces
Keywords: Affective computing, Empathy, Facial expressions, Human-robot interaction, Social robotics
Field: Social robot, Conversation, Gesture, Computer science, Human–computer interaction, Facial expression, Imitation, Affective computing, Robot, Human–robot interaction
DocType: Journal
Volume: 3
Issue: 1
ISSN: 1783-7677
Citations: 49
PageRank: 2.44
References: 9
Authors: 3
Name              Order  Citations  PageRank
Laurel D. Riek    1      2152       5.17
Philip C. Paul    2      49         2.44
Peter Robinson    3      14381      29.42