Title
Online Affect Tracking with Multimodal Kalman Filters
Abstract
Arousal and valence have been widely used to represent emotions dimensionally and measure them continuously in time. In this paper, we introduce a computational framework for tracking these affective dimensions from multimodal data as an entry to the Multimodal Affect Recognition Sub-Challenge of the 2016 Audio/Visual Emotion Challenge and Workshop (AVEC2016). We propose a linear dynamical system approach with a late fusion method that accounts for the dynamics of the affective state evolution (i.e., arousal or valence). To this end, single-modality predictions are modeled as observations in a Kalman filter formulation in order to continuously track each affective dimension. Leveraging the inter-correlations between arousal and valence, we use the predicted arousal as an additional feature to improve valence predictions. Furthermore, we propose a conditional framework to select Kalman filters of different modalities while tracking. This framework employs voicing probability and facial posture cues to detect the absence or presence of each input modality. Our multimodal fusion results on the development and the test set provide a statistically significant improvement over the baseline system from AVEC2016. The proposed approach can be potentially extended to other multimodal tasks with inter-correlated behavioral dimensions.
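To make the late-fusion idea in the abstract concrete, the sketch below is a minimal illustration and not the authors' implementation: per-modality arousal (or valence) predictions are treated as noisy observations of a single latent affective state and fused with a standard Kalman filter, with an optional availability mask standing in for the paper's conditional modality selection. The class name LateFusionKalman, the matrix values, and the noise settings are illustrative assumptions.

```python
# Minimal sketch (not the authors' code): fuse per-modality affect predictions
# with a Kalman filter whose latent state is one affective dimension
# (arousal or valence). All matrix values and noise levels are assumptions.
import numpy as np


class LateFusionKalman:
    def __init__(self, n_modalities, process_var=1e-3, obs_var=5e-2):
        self.A = np.eye(1)                       # random-walk state transition
        self.C = np.ones((n_modalities, 1))      # each modality observes the same state
        self.Q = np.eye(1) * process_var         # process-noise variance (assumed)
        self.R = np.eye(n_modalities) * obs_var  # observation-noise covariance (assumed)
        self.x = np.zeros((1, 1))                # state estimate
        self.P = np.eye(1)                       # state covariance

    def step(self, z, available=None):
        """z: per-modality predictions for one frame; available: boolean mask of modalities."""
        # Predict
        self.x = self.A @ self.x
        self.P = self.A @ self.P @ self.A.T + self.Q
        # Conditional modality selection: drop observations from absent modalities
        if available is None:
            available = np.ones(len(z), dtype=bool)
        if available.any():
            C = self.C[available]
            R = self.R[np.ix_(available, available)]
            zv = np.asarray(z, dtype=float)[available].reshape(-1, 1)
            # Update
            S = C @ self.P @ C.T + R
            K = self.P @ C.T @ np.linalg.inv(S)
            self.x = self.x + K @ (zv - C @ self.x)
            self.P = (np.eye(1) - K @ C) @ self.P
        return float(self.x[0, 0])


# Usage with hypothetical audio/video arousal predictions for two frames;
# the second frame simulates a missing video observation.
kf = LateFusionKalman(n_modalities=2)
print(kf.step([0.31, 0.27]))
print(kf.step([0.35, 0.0], available=np.array([True, False])))
```

A random-walk state model keeps the sketch simple; in the paper's conditional framework, cues such as voicing probability and facial posture would determine which modalities are marked available at each frame.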
Year
2016
DOI
10.1145/2988257.2988259
Venue
AVEC@ACM Multimedia
Keywords
Multimodal affective computing, arousal, valence, linear dynamical systems, Kalman filters
Field
Modalities, Arousal, Computer vision, Linear dynamical system, State evolution, Computer science, Kalman filter, Artificial intelligence, Baseline system, Affect (psychology), Test set
DocType
Conference
Citations
1
PageRank
0.34
References
9
Authors
6
Name                  Order  Citations  PageRank
Krishna S.            1      9          8.31
Rahul Gupta           2      92         18.86
M. D. Nasir           3      58         6.14
Brandon M. Booth      4      4          4.22
Sungbok Lee           5      1394       84.13
Narayanan Shrikanth   6      5558       439.23