Abstract |
---|
Human conversational partners usually try to interpret the speaker's or listener's affective cues and respond to them accordingly. Recently, the modelling and simulation of such behaviours have been recognized as an essential factor for more natural man-machine communication. The implicit emotion channels of human communication, such as speech, facial expression, gesture, and physiological responses, are generally used to extract emotion-relevant features for the computational perception of emotion. So far, research on emotion recognition has mostly dealt with offline analysis of recorded emotion corpora; online processing (in real time or near real time) has hardly been addressed. Online processing is, however, a necessary prerequisite for the realization of human-computer interfaces that analyze and respond to the user's emotions while he or she is interacting with an application. In this paper, we first describe how we recognize emotions from various modalities, including speech, gestures, and biosignals. We then present Smart Sensor Integration (SSI), a framework which we developed to meet the specific requirements of online emotion recognition. |
Year | DOI | Venue
---|---|---
2009 | 10.1524/itit.2009.0557 | it - Information Technology

Keywords | Field | DocType
---|---|---
Information Systems: Information Interfaces and Presentation, Computing Methodologies: Artificial Intelligence, Emotion recognition, affective computing, multimodal fusion, multimodal user interfaces, speech recognition, physiological signal | Computer science, Emotion recognition, Sensory processing, Human–computer interaction, Affective computing | Journal

Volume | Issue | ISSN
---|---|---
51 | 6 | 1611-2776

Citations | PageRank | References
---|---|---
0 | 0.34 | 3

Authors |
---|
6 |
Name | Order | Citations | PageRank |
---|---|---|---|
Jonghwa Kim | 1 | 623 | 46.51 |
Johannes Wagner | 2 | 654 | 49.55 |
Thurid Vogt | 3 | 459 | 33.72 |
Elisabeth André | 4 | 3634 | 433.65 |
Frank Jung | 5 | 0 | 0.34 |
Matthias Rehm | 6 | 157 | 21.98 |