Title: Emotional remapping of music to facial animation
Abstract: We propose a method to extract emotional data from a piece of music and then use that data, via a remapping algorithm, to automatically animate an emotional 3D face sequence. The method is based on studies of the emotional aspects of music and on our parametric behavioral head model for facial animation. We address the issue of affective communication remapping in general, i.e., the translation of affective content (e.g., emotions and mood) from one communication form to another. We report on the results of our MusicFace system, which uses these techniques to automatically create emotional facial animations from multi-instrument polyphonic music scores in MIDI format and a remapping rule set.
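The abstract describes a rule set that remaps musical features to emotional facial animation parameters. A minimal sketch of what such a remapping could look like is below; the function names, the specific rules, and the parameter ranges are illustrative assumptions, not the paper's actual algorithm:

```python
# Hypothetical sketch of a rule-based music-to-face remapping, in the
# spirit of MusicFace. All rules and parameter names are assumptions
# for illustration; they are not taken from the paper.

def music_to_emotion(tempo_bpm: float, mode: str):
    """Map coarse musical features to a (valence, arousal) pair in [0, 1].

    Rule of thumb used here (an assumption): faster tempo raises arousal,
    and a major mode is mapped to higher valence than a minor mode.
    """
    arousal = min(1.0, max(0.0, (tempo_bpm - 40.0) / 160.0))
    valence = 0.8 if mode == "major" else 0.2
    return valence, arousal

def emotion_to_face(valence: float, arousal: float):
    """Map (valence, arousal) to hypothetical facial parameters in [0, 1]."""
    return {
        "smile": valence,                        # happier music -> bigger smile
        "brow_raise": arousal * (1.0 - valence), # tense, negative -> raised brows
        "eye_open": 0.5 + 0.5 * arousal,         # more arousal -> wider eyes
    }

# Example: a 120 BPM piece in a major key.
face = emotion_to_face(*music_to_emotion(tempo_bpm=120.0, mode="major"))
```

In a full system along these lines, the feature extraction step would read tempo, mode, dynamics, and instrumentation from the MIDI score per time window, and the rule set would be applied frame by frame to drive the head model's animation parameters.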
Year: 2006
DOI: 10.1145/1183316.1183337
Venue: Sandbox@SIGGRAPH
Field: Computer vision, Mood, Affective communication, Computer graphics (images), Computer science, MIDI, Animation, Computer facial animation, Artificial intelligence, Polyphony, Computer animation, Affect (psychology)
DocType: Conference
ISBN: 1-59593-386-7
Citations: 9
PageRank: 0.92
References: 7
Authors: 2

Name | Order | Citations | PageRank
Steve Dipaola | 1 | 204 | 37.28
Ali Arya | 2 | 110 | 20.31