Title
Representing Affective Facial Expressions for Robots and Embodied Conversational Agents by Facial Landmarks.
Abstract
Affective robots and embodied conversational agents require convincing facial expressions to be socially acceptable. To generate facial expressions virtually, we need to investigate the relationship between technology and human perception of affective and social signals. Facial landmarks, the locations of the crucial parts of a face, are important for the perception of the affective and social signals conveyed by facial expressions. Earlier research did not use this kind of technology, but instead used analogue technology to generate point-light faces. The goal of our study is to investigate whether digitally extracted facial landmarks contain sufficient information to enable humans to recognize the facial expressions. In this study, participants viewed facial expressions encoded as moving landmarks: facial-landmark videos extracted by face-analysis software from full-face videos of acted emotions. The facial-landmark videos were presented to 16 participants, who were instructed to classify the sequences according to the emotion represented. Results revealed that participants accurately recognized the emotions in three of the five facial-landmark videos (happiness, sadness, and anger), suggesting that landmarks contain information about the expressed emotions; for the other two (fear and disgust), recognition accuracy was below chance. Results also show that emotions with high levels of arousal and valence are better recognized than those with low levels of arousal and valence. We argue that the question of whether these digitally extracted facial landmarks can serve as a basis for representing facial expressions of emotions is crucial for the future development of successful human-robot interaction.
We conclude that landmarks provide a basis for the virtual generation of emotions in humanoid agents, and discuss how additional facial information might be included to provide a sufficient basis for faithful emotion identification.
Year
2013
DOI
10.1007/s12369-013-0208-9
Venue
I. J. Social Robotics
Keywords
Robots, Embodied conversational agents, Emotion, Facial expression, Facial landmarks, FaceTracker
Field
Social psychology, Sadness, Facial Action Coding System, Disgust, Psychology, Embodied cognition, Facial expression, Anger, Affect (psychology), Perception
DocType
Journal
Volume
5
Issue
4
ISSN
1875-4791
Citations
1
PageRank
0.35
References
9
Authors
6

Name                 Order   Citations   PageRank
Caixia Liu           1       1           0.68
Jaap Ham             2       284         24.10
Eric O. Postma       3       195         27.10
Cees J. H. Midden    4       215         41.38
Bart Joosten         5       7           2.35
Martijn Goudbeek     6       72          13.73