Title
Detecting Engagement In Egocentric Video
Abstract
In a wearable camera video, we see what the camera wearer sees. While this makes it easy to know roughly what he chose to look at, it does not immediately reveal when he was engaged with the environment. Specifically, at what moments did his focus linger, as he paused to gather more information about something he saw? Knowing this answer would benefit various applications in video summarization and augmented reality, yet prior work focuses solely on the "what" question (estimating saliency, gaze) without considering the "when" (engagement). We propose a learning-based approach that uses long-term egomotion cues to detect engagement, specifically in browsing scenarios where one frequently takes in new visual information (e.g., shopping, touring). We introduce a large, richly annotated dataset for ego-engagement that is the first of its kind. Our approach outperforms a wide array of existing methods. We show engagement can be detected well independent of both scene appearance and the camera wearer's identity.
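To make the abstract's pipeline concrete, below is a minimal sketch of the general idea it implies: per-frame egomotion statistics from dense optical flow, pooled over a long temporal window, fed to a random-forest classifier (the keywords list "Optical Flow" and "Random Forest Classifier"). The window length, feature set, and all function names here are illustrative assumptions, not the paper's actual design.

```python
import cv2
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def egomotion_features(frames, window=30):
    """Pool per-frame optical-flow statistics over a sliding temporal window.

    `frames` is a list of grayscale (uint8) frames; `window` is an assumed
    long-term horizon, not the paper's value.
    """
    per_frame = []
    for prev, curr in zip(frames, frames[1:]):
        # Dense flow between consecutive frames (Farneback; a stand-in for
        # whatever flow method the authors used).
        flow = cv2.calcOpticalFlowFarneback(
            prev, curr, None, 0.5, 3, 15, 3, 5, 1.2, 0)
        mag, ang = cv2.cartToPolar(flow[..., 0], flow[..., 1])
        # Simple motion summary: magnitude mean/std plus mean direction.
        per_frame.append(
            [mag.mean(), mag.std(), np.cos(ang).mean(), np.sin(ang).mean()])
    per_frame = np.array(per_frame)

    # Long-term pooling: concatenate mean and std of the window's statistics.
    feats = []
    for t in range(window, len(per_frame)):
        w = per_frame[t - window:t]
        feats.append(np.concatenate([w.mean(0), w.std(0)]))
    return np.array(feats)

# Demo on synthetic grayscale frames standing in for egocentric video,
# with placeholder engagement labels (real labels come from annotation).
rng = np.random.default_rng(0)
frames = [rng.integers(0, 255, (120, 160), dtype=np.uint8) for _ in range(120)]
X = egomotion_features(frames)
y = rng.integers(0, 2, len(X))
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print(clf.predict(X[:5]))  # per-window engagement predictions
```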
Year
2016
DOI
10.1007/978-3-319-46454-1_28
Venue
COMPUTER VISION - ECCV 2016, PT V
Keywords
Ground Truth, Random Forest, Optical Flow, Inertial Sensor, Random Forest Classifier
DocType
Conference
Volume
9909
ISSN
0302-9743
Citations
10
PageRank
0.49
References
28
Authors
2
Name | Order | Citations | PageRank
Yu-Chuan Su | 1 | 87 | 14.90
Kristen Grauman | 2 | 6258 | 326.34