Title
Emotions and Messages in Simple Robot Gestures
Abstract
Understanding how people interpret robot gestures will aid the design of effective social robots. In two studies, we examine the generation and interpretation of gestures by a simple social robot capable of head and arm movement. In the first study, four participants created gestures, with corresponding messages and emotions, based on 12 scenarios provided to them. In the second study, the resulting gestures were shown to 12 participants, who judged which emotions and messages were being conveyed. Knowledge (present or absent) of the motivating scenario (context) for each gesture was manipulated as an experimental factor. Context was found to assist message understanding while providing only modest assistance to emotion recognition. While better than chance, accuracies for both emotion recognition (22%) and message understanding (40%) were relatively low. The results are discussed in terms of implied guidelines for designing gestures for social robots.
Year
2009
DOI
10.1007/978-3-642-02577-8_36
Venue
HCI (2)
Keywords
experimental factor,arm movement,corresponding message,emotion recognition,simple social robot,social robot,robot gesture,different scenario,simple robot gestures,effective social robot,message understanding,gestures,emotion,human robot interaction,social robots
Field
Social robot,Emotion recognition,Computer science,Gesture,Human–computer interaction,Robot,Human–robot interaction
DocType
Conference
Volume
5611
ISSN
0302-9743
Citations
5
PageRank
1.37
References
7
Authors
4
Name               Order  Citations  PageRank
Jamy Li            1      156        17.28
Mark Chignell      2      1159       153.58
Sachi Mizobuchi    3      148        11.78
Michiaki Yasumura  4      266        36.88