Title
Modeling Mutual Influence Of Interlocutor Emotion States In Dyadic Spoken Interactions
Abstract
In dyadic human interactions, mutual influence, a person's influence on the interacting partner's behaviors, has been shown to be important and can be incorporated into the modeling framework for characterizing and automatically recognizing the participants' states. We propose a Dynamic Bayesian Network (DBN) to explicitly model the conditional dependency between two interacting partners' emotion states in a dialog, using data from the IEMOCAP corpus of expressive dyadic spoken interactions. We focus on automatically computing the Valence-Activation emotion attributes to obtain a continuous characterization of the participants' emotion flow. The proposed DBN models the temporal dynamics of the emotion states as well as the mutual influence between speakers in a dialog. With speech-based features, the proposed network improves classification accuracy by 3.67% absolute and 7.12% relative over the Gaussian Mixture Model (GMM) baseline on isolated turn-by-turn emotion classification.
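The sketch below is a minimal, illustrative take on the core idea described in the abstract, not the authors' implementation: a coupled two-chain DBN in which each partner's next emotion state is conditioned on both partners' previous states. It assumes discretized Valence-Activation states and precomputed per-turn observation likelihoods (e.g., from per-state GMMs of speech features); the state count, the random parameters, and the simplification that both speakers produce an observation every turn are assumptions made only for the example.

```python
# Minimal sketch of joint filtering over two interlocutors' emotion states.
# Not the authors' implementation: the state count, random parameters, and
# the assumption that both speakers emit features every turn are illustrative.
import numpy as np

N_STATES = 4  # e.g., quantized Valence-Activation classes (assumption)
rng = np.random.default_rng(0)

def normalize(x, axis=-1):
    return x / x.sum(axis=axis, keepdims=True)

# P(s_A[t], s_B[t] | s_A[t-1], s_B[t-1]): each partner's next state is
# conditioned on BOTH partners' previous states, which encodes mutual influence.
trans = normalize(rng.random((N_STATES,) * 4), axis=(-2, -1))

# Joint prior over the two speakers' initial states.
prior = rng.random((N_STATES, N_STATES))
prior /= prior.sum()

def forward(obs_ll_A, obs_ll_B, prior, trans):
    """Turn-by-turn filtering of the joint emotion state.

    obs_ll_A, obs_ll_B: (T, N_STATES) observation likelihoods per turn for
    each speaker, e.g., GMM likelihoods of speech features under each state.
    Returns a (T, N_STATES, N_STATES) array of posterior joint beliefs.
    """
    belief = prior * obs_ll_A[0][:, None] * obs_ll_B[0][None, :]
    belief /= belief.sum()
    beliefs = [belief]
    for t in range(1, obs_ll_A.shape[0]):
        # Predict the next joint state, then weight by both speakers' evidence.
        pred = np.einsum('ij,ijkl->kl', belief, trans)
        belief = pred * obs_ll_A[t][:, None] * obs_ll_B[t][None, :]
        belief /= belief.sum()
        beliefs.append(belief)
    return np.stack(beliefs)

# Toy usage with random "likelihoods" standing in for GMM scores.
T = 6
post = forward(rng.random((T, N_STATES)), rng.random((T, N_STATES)), prior, trans)
print("Speaker A states per turn:", post.sum(axis=2).argmax(axis=1))
print("Speaker B states per turn:", post.sum(axis=1).argmax(axis=1))
```

The cross-speaker transition table is what distinguishes this kind of coupled temporal model from the isolated turn-by-turn GMM decision used as the baseline in the abstract.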
Year
2009
Venue
INTERSPEECH 2009: 10TH ANNUAL CONFERENCE OF THE INTERNATIONAL SPEECH COMMUNICATION ASSOCIATION 2009, VOLS 1-5
Keywords
emotion recognition, mutual influence, Dynamic Bayesian Network, dyadic interaction
Field
Dialog box, Pattern recognition, Emotion recognition, Computer science, Emotion classification, Speech recognition, Artificial intelligence, Dyadic interaction, Mixture model, Dynamic Bayesian network
DocType
Conference
Citations
31
PageRank
1.30
References
7
Authors
4
Name                 Order  Citations  PageRank
Chi-Chun Lee         1      654        49.41
Carlos Busso         2      1616       93.04
Sungbok Lee          3      1394       84.13
Narayanan Shrikanth  4      5558       439.23