Title
Annotation and analysis of listener's engagement based on multi-modal behaviors.
Abstract
We address the annotation of engagement in the context of human-machine interaction. Engagement represents how interested a user is in the current interaction and how willing they are to continue it. The conversational data used for annotation come from a human-robot interaction corpus in which a human subject talks with the android ERICA, which is remotely operated by another human subject. The annotation was carried out by multiple third-party annotators, whose task was to detect the time points at which the level of engagement becomes high. The annotation results indicate that the annotators agree with one another, although the number of annotated points differs among them. We also find that the level of engagement is related to turn-taking behaviors. Furthermore, we interviewed the annotators to identify the behaviors they used as cues for a high level of engagement. The results suggest that laughter, backchannels, and nodding are related to the level of engagement.
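The abstract does not describe how agreement between annotators was measured. As a rough illustration only, the sketch below shows one plausible way to compare annotators who each mark time points of high engagement: two points are treated as matching if they fall within a tolerance window, and each annotator pair is scored by the fraction of overlapping points. The function names, the tolerance value, and the toy data are hypothetical and are not taken from the paper.

```python
# Hypothetical sketch (not the paper's method): pairwise agreement between
# annotators who each mark time points (in seconds) where a listener's
# engagement becomes high. Two points agree if they lie within `tol` seconds.

from typing import Dict, List, Tuple
from itertools import combinations


def matched_points(a: List[float], b: List[float], tol: float = 2.0) -> int:
    """Count points in `a` that have at least one point in `b` within `tol` seconds."""
    return sum(1 for t in a if any(abs(t - u) <= tol for u in b))


def pairwise_agreement(annotations: Dict[str, List[float]],
                       tol: float = 2.0) -> List[Tuple[str, str, float]]:
    """For each annotator pair, report matched points divided by the size of
    the larger annotation set (a simple overlap ratio)."""
    results = []
    for (name_a, pts_a), (name_b, pts_b) in combinations(annotations.items(), 2):
        matches = matched_points(pts_a, pts_b, tol)
        denom = max(len(pts_a), len(pts_b)) or 1
        results.append((name_a, name_b, matches / denom))
    return results


if __name__ == "__main__":
    # Toy data: three annotators mark different numbers of "high engagement" points.
    toy = {
        "annotator_1": [12.3, 45.0, 78.6, 120.4],
        "annotator_2": [11.8, 44.2, 119.9],
        "annotator_3": [13.1, 46.5, 80.0, 121.0, 150.2],
    }
    for a, b, score in pairwise_agreement(toy, tol=2.0):
        print(f"{a} vs {b}: agreement = {score:.2f}")
```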
Year
2016
DOI
10.1145/3011263.3011271
Venue
MA3HMI@ICMI
DocType
Conference
Citations
0
PageRank
0.34
References
6
Authors
5
Name | Order | Citations | PageRank
Koji Inoue | 1 | 370 | 42.28
Divesh Lala | 2 | 52 | 13.91
Shizuka Nakamura | 3 | 2 | 4.45
Katsuya Takanashi | 4 | 49 | 13.40
Tatsuya Kawahara | 5 | 1352 | 196.52