| Abstract |
|---|
| Several studies report successful results on employing socially assistive robots as interfaces in the assisted-living domain. In our opinion, recognizing human emotions is crucial for such robots to plan their responses and interact successfully with people. To this aim, prosodic features of speech, together with facial expressions and gestures, may be used to recognize the emotional state of the user. The information gained from these different sources may be fused in order to endow the robot with the capability to reason on the user's affective state. In this paper we describe how this capability has been implemented in the NAO robot and how it allows simulating empathic behaviors in the context of Ambient Assisted Living. |
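The abstract mentions fusing information from speech prosody, facial expressions, and gestures to estimate the user's affective state. As a minimal illustrative sketch (not the paper's actual method; the emotion labels, weights, and function names below are assumptions), a decision-level fusion could average the probability distributions produced by per-modality classifiers:

```python
# Illustrative late-fusion sketch: each modality classifier is assumed
# to output a probability distribution over a shared set of emotion
# labels; a weighted average fuses them into a single estimate.

EMOTIONS = ["happy", "sad", "angry", "neutral"]  # hypothetical label set

def fuse(modality_probs, weights):
    """Weighted average of per-modality emotion distributions.

    modality_probs: {modality: {emotion: probability}}
    weights: {modality: relative weight}
    Returns the most likely emotion and the fused distribution.
    """
    fused = {e: 0.0 for e in EMOTIONS}
    total = sum(weights.values())
    for modality, probs in modality_probs.items():
        w = weights[modality] / total  # normalize weights
        for e in EMOTIONS:
            fused[e] += w * probs.get(e, 0.0)
    return max(fused, key=fused.get), fused

label, dist = fuse(
    {
        "speech":  {"happy": 0.2, "sad": 0.5, "angry": 0.1, "neutral": 0.2},
        "face":    {"happy": 0.1, "sad": 0.6, "angry": 0.1, "neutral": 0.2},
        "gesture": {"happy": 0.3, "sad": 0.3, "angry": 0.1, "neutral": 0.3},
    },
    weights={"speech": 0.4, "face": 0.4, "gesture": 0.2},
)
# "sad" receives the highest fused probability with these inputs
```

Decision-level fusion like this keeps the modalities independent, so a missing channel (e.g. the face is not visible) can simply be dropped and the remaining weights renormalized.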
| Year | DOI | Venue |
|---|---|---|
| 2015 | 10.1145/2808435.2808445 | CHItaly |

| Field | DocType | Citations |
|---|---|---|
| Prosody, NAO robot, Communication, Gesture, Psychology, Human–computer interaction, Facial expression, Affective computing, Affect (psychology), Robot | Conference | 0 |

| PageRank | References | Authors |
|---|---|---|
| 0.34 | 19 | 4 |
| Name | Order | Citations | PageRank |
|---|---|---|---|
| De Carolis | 1 | 433 | 65.33 |
| Stefano Ferilli | 2 | 722 | 101.11 |
| Giuseppe Palestra | 3 | 1 | 4.09 |
| Valeria Carofiglio | 4 | 235 | 25.03 |