Title
IEMOCAP: interactive emotional dyadic motion capture database
Abstract
Since emotions are expressed through a combination of verbal and non-verbal channels, a joint analysis of speech and gestures is required to understand expressive human communication. To facilitate such investigations, this paper describes a new corpus named the “interactive emotional dyadic motion capture database” (IEMOCAP), collected by the Speech Analysis and Interpretation Laboratory (SAIL) at the University of Southern California (USC). This database was recorded from ten actors in dyadic sessions with markers on the face, head, and hands, which provide detailed information about their facial expressions and hand movements during scripted and spontaneous spoken communication scenarios. The actors performed selected emotional scripts and also improvised hypothetical scenarios designed to elicit specific types of emotions (happiness, anger, sadness, frustration and neutral state). The corpus contains approximately 12 h of data. The detailed motion capture information, the interactive setting to elicit authentic emotions, and the size of the database make this corpus a valuable addition to the existing databases in the community for the study and modeling of multimodal and expressive human communication.
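As a rough illustration of how such a corpus is typically consumed, the sketch below gathers the categorical emotion label (the happiness, anger, sadness, frustration, and neutral classes mentioned in the abstract) for each recorded turn. It is a minimal sketch, not part of this record: it assumes the directory layout of the released corpus (Session1 through Session5, each with dialog/EmoEvaluation/*.txt annotation files) and a hypothetical root folder name, both of which should be verified against your copy of the data.

    # Sketch: collect per-utterance categorical emotion labels from IEMOCAP.
    # Assumes the released layout Session*/dialog/EmoEvaluation/*.txt and an
    # annotation line format like:
    #   [6.2901 - 8.2357]  Ses01F_impro01_F000  neu  [2.5000, 2.5000, 2.5000]
    # Both the path pattern and the line format are assumptions to check.
    import glob
    import os
    import re

    LINE_RE = re.compile(
        r"\[(?P<start>\d+\.\d+)\s*-\s*(?P<end>\d+\.\d+)\]\s+"
        r"(?P<utt>\S+)\s+(?P<emotion>\w+)"
    )

    def load_labels(corpus_root):
        """Return {utterance_id: emotion_code} across all sessions found."""
        labels = {}
        pattern = os.path.join(
            corpus_root, "Session*", "dialog", "EmoEvaluation", "*.txt"
        )
        for path in glob.glob(pattern):
            with open(path) as f:
                for line in f:
                    m = LINE_RE.match(line)
                    if m:
                        labels[m.group("utt")] = m.group("emotion")
        return labels

    if __name__ == "__main__":
        # "IEMOCAP_full_release" is a hypothetical root directory name.
        labels = load_labels("IEMOCAP_full_release")
        keep = {"neu", "hap", "sad", "ang"}  # a subset often used in practice
        subset = {u: e for u, e in labels.items() if e in keep}
        print(len(labels), "labeled turns;", len(subset), "in the 4-class subset")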
Year
2008
DOI
10.1007/s10579-008-9076-6
Venue
Language Resources and Evaluation
Keywords
Audio-visual database, Dyadic interaction, Emotion, Emotional assessment, Motion capture system
DocType
Journal
Volume
42
Issue
4
ISSN
1574-020X
Citations
317
PageRank
11.18
References
28
Authors
9
Name                 Order  Citations  PageRank
Carlos Busso         1      1616       93.04
Murtaza Bulut        2      848        39.45
Chi-Chun Lee         3      654        49.41
Abe Kazemzadeh       4      957        52.95
Emily Mower          5      1062       59.08
Samuel Kim           6      317        11.18
Jeannette N. Chang   7      317        11.18
Sungbok Lee          8      1394       84.13
Narayanan Shrikanth  9      5558       439.23