Abstract
Since emotions are expressed through a combination of verbal and non-verbal channels, a joint analysis of speech and gestures
is required to understand expressive human communication. To facilitate such investigations, this paper describes a new corpus
named the “interactive emotional dyadic motion capture database” (IEMOCAP), collected by the Speech Analysis and Interpretation
Laboratory (SAIL) at the University of Southern California (USC). This database was recorded from ten actors in dyadic sessions
with markers on the face, head, and hands, which provide detailed information about their facial expressions and hand movements
during scripted and spontaneous spoken communication scenarios. The actors performed selected emotional scripts and also improvised
hypothetical scenarios designed to elicit specific types of emotions (happiness, anger, sadness, frustration and neutral state).
The corpus contains approximately 12 hours of data. The detailed motion capture information, the interactive setting used to elicit
authentic emotions, and the size of the database make this corpus a valuable addition to the existing databases in the community
for the study and modeling of multimodal and expressive human communication.
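For readers planning to work with a corpus organized along these lines, the sketch below shows one way the abstract's structure (dyadic sessions, scripted versus improvised turns, and the five categorical emotion labels) might be represented programmatically. It is a minimal illustration only: the `Turn` schema, field names, and sample records are hypothetical and do not reflect the actual IEMOCAP release format.

```python
from dataclasses import dataclass

# The five target categories elicited in IEMOCAP, per the abstract.
EMOTIONS = {"happiness", "anger", "sadness", "frustration", "neutral"}

@dataclass
class Turn:
    """One speaker turn in a dyadic session (hypothetical schema)."""
    session: int        # dyadic session index (ten actors, two per session)
    speaker: str        # actor identifier (hypothetical naming)
    scripted: bool      # scripted play vs. improvised scenario
    emotion: str        # categorical emotion label
    duration_s: float   # turn length in seconds

def filter_turns(turns, emotion, scripted=None):
    """Select turns by emotion, optionally restricted to scripted or improvised data."""
    if emotion not in EMOTIONS:
        raise ValueError(f"unknown emotion label: {emotion}")
    return [t for t in turns
            if t.emotion == emotion
            and (scripted is None or t.scripted == scripted)]

# Example with two fabricated turns, for illustration only.
turns = [
    Turn(session=1, speaker="F01", scripted=True, emotion="anger", duration_s=3.2),
    Turn(session=1, speaker="M01", scripted=False, emotion="frustration", duration_s=4.7),
]
print(filter_turns(turns, "anger", scripted=True))
```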
Field | Value
---|---
Year | 2008
DOI | 10.1007/s10579-008-9076-6
Venue | Language Resources and Evaluation
Keywords | Audio-visual database, Dyadic interaction, Emotion, Emotional assessment, Motion capture system
DocType | Journal
Volume | 42
Issue | 4
ISSN | 1574-020X
Citations | 317
PageRank | 11.18
References | 28
Authors | 9
Name | Order | Citations | PageRank |
---|---|---|---|
Carlos Busso | 1 | 1616 | 93.04 |
Murtaza Bulut | 2 | 848 | 39.45 |
Chi-Chun Lee | 3 | 654 | 49.41 |
Abe Kazemzadeh | 4 | 957 | 52.95 |
Emily Mower | 5 | 1062 | 59.08 |
Samuel Kim | 6 | 317 | 11.18 |
Jeannette N. Chang | 7 | 317 | 11.18 |
Sungbok Lee | 8 | 1394 | 84.13 |
Shrikanth Narayanan | 9 | 5558 | 439.23