| Abstract |
|---|
| Human-to-computer interaction in a variety of applications could benefit if systems could accurately analyze and respond to their users' affect. Although a great deal of research has been conducted on affect recognition, very little of this work has considered what information is appropriate to extract in specific situations. Toward understanding how specific applications such as affective tutoring and affective entertainment could benefit, we present two experiments. In the first experiment, we found that students' facial expressions, together with their body actions, gave little information about their internal emotions per se, but were useful features for predicting their self-reported "true" mental state. In the second experiment, we found significant differences between the facial expressions and the self-reported affective state of viewers watching a movie sequence. Our results suggest that the noisy relationship between observable gestures and underlying affect must be accounted for when designing affective computing applications. |
| Year | DOI | Venue |
|---|---|---|
| 2007 | 10.1007/978-3-540-74889-2_40 | ACII |

| Keywords | Field | DocType |
|---|---|---|
| affective computing application, appropriate information, underlying affect, self-reported affective state, towards knowledge-based affective interaction, affective entertainment, situational interpretation, specific application, mental state, facial expression, affect recognition, affective tutoring, affective computing, knowledge base | Social psychology, Communication, Gesture, Psychology, Facial expression, Affective science, Situational ethics, Affective computing, Affect (psychology), Affect control theory, Mental state | Conference |

| Volume | ISSN | Citations |
|---|---|---|
| 4738 | 0302-9743 | 6 |

| PageRank | References | Authors |
|---|---|---|
| 0.57 | 9 | 4 |
| Name | Order | Citations | PageRank |
|---|---|---|---|
| Abdul Rehman Abbasi | 1 | 46 | 3.33 |
| Takeaki Uno | 2 | 1319 | 107.99 |
| Matthew N. Dailey | 3 | 331 | 26.44 |
| Nitin V. Afzulpurkar | 4 | 50 | 5.44 |