Abstract |
---|
Spoken dialog systems (SDS) integrated into human-machine interfaces are becoming a standard technology. However, current state-of-the-art SDS usually cannot offer the user a natural way of communicating: existing automated dialog systems pay too little attention to interaction problems caused by affected user behavior. As a result, Automatic Speech Recognition (ASR) engines fail to recognize affected speech, and the dialog strategy does not exploit the user's emotional state. This paper addresses several aspects of processing affected speech in natural human-machine interaction. First, we propose an ASR engine adapted to affected speech. Second, we describe our methods for recognizing emotion in speech and present our emotion classification results from the INTERSPEECH 2009 Emotion Challenge. Third, we evaluate the speech recognition models adapted to affected speech and introduce an approach to emotion-adaptive dialog management in human-machine interaction. |
Year | Venue | Keywords |
---|---|---|
2009 | INTERSPEECH 2009: 10TH ANNUAL CONFERENCE OF THE INTERNATIONAL SPEECH COMMUNICATION ASSOCIATION 2009, VOLS 1-5 | Emotion Recognition, Affected Speech Recognition, Emotion Challenge |
Field | DocType | Citations
---|---|---|
Dialog box, Spoken dialog systems, Speech analytics, Voice activity detection, Emotion recognition, Computer science, Emotion classification, Speech recognition, Natural language processing, Artificial intelligence, Dialog system, Human machine interaction | Conference | 12
PageRank | References | Authors
---|---|---|
0.60 | 10 | 2
Name | Order | Citations | PageRank |
---|---|---|---|
Bogdan Vlasenko | 1 | 235 | 12.72 |
Andreas Wendemuth | 2 | 451 | 41.74 |