Abstract |
---|
This paper presents the Dynamic Emotion Representation (DER) and demonstrates how an instance of this model can be integrated into a facial animation system. The DER model has been implemented to enable users to create their own emotion representations: developers can select which emotions to include and how these interact. The instance of the DER model described in this paper is composed of three layers, each representing states that change over a different time scale: behavior activations, emotions and moods. The design of this DER is discussed with reference to emotion theories and to the needs of a facial animation system. The DER is used in our Emotionally Expressive Facial Animation System (EE-FAS) to produce emotional expressions and to select facial signals corresponding to communicative functions, based both on the agent's emotional state and on the comparison between that state and the intended meanings expressed through communicative functions. |
Year | DOI | Venue
---|---|---
2006 | 10.1142/S0219843606000758 | INTERNATIONAL JOURNAL OF HUMANOID ROBOTICS
Keywords | Field | DocType
---|---|---
emotion representations, facial animation, communicative functions | Computer vision, Computer science, Human–computer interaction, Emotional expression, Computer facial animation, Artificial intelligence, Multimedia | Journal
Volume | Issue | ISSN
---|---|---
3 | 3 | 0219-8436

Citations | PageRank | References
---|---|---
7 | 0.85 | 7
Authors |
---|
3 |
Name | Order | Citations | PageRank
---|---|---|---
Emmanuel Tanguy | 1 | 83 | 5.37 |
Philip J. Willis | 2 | 114 | 13.68 |
Joanna J. Bryson | 3 | 305 | 35.10 |