Title: Towards Designing Android Faces after Actual Humans
Abstract
Using the face as their primary affective interface, android robots and other embodied agents display emotional facial expressions and convey messages about their identity, gender, age, race, and attractiveness. We examine whether androids can convey emotionally relevant information via their static facial signals, just as humans do. Based on the finding that social information can be accurately identified from still images of non-expressive, unfamiliar faces, a judgment paradigm was employed to discover and compare the styles of facial expression of the Geminoid-DK android (modeled after an actual human) and its Original (the actual human). The emotional judgments were collected through an online survey with video stimuli and questionnaires, following a forced-choice design. Analysis of the results indicated that the emotional judgments for the Geminoid-DK depend strongly on the judgments initially made for the Original, suggesting that androids inherit the same style of facial expression as their originals. Our findings support the case for designing android faces after specific actual persons whose facial features are familiar to the users and relevant to the robotic task, in order to increase the chance of sustaining a more emotional interaction.
Year: 2015
DOI: 10.1007/978-3-319-19728-9_9
Venue: AGENT AND MULTI-AGENT SYSTEMS: TECHNOLOGIES AND APPLICATIONS
Keywords: Android robot, Facial expression, Emotion, Static signals, Social perception, Human-agent interaction
Field: Computer science, Attractiveness, Human–computer interaction, Artificial intelligence, Social information, Social perception, Computer vision, Android (operating system), Android (robot), Facial expression, Robot, Affect (psychology), Machine learning
DocType: Conference
Volume: 38
ISSN: 2190-3018
Citations: 1
PageRank: 0.35
References: 7
Authors: 2
Name: Evgenios Vlachos — Order: 1, Citations: 13, PageRank: 3.45
Name: Henrik Schärfe — Order: 2, Citations: 49, PageRank: 9.30