Title
Activity Recognition In Egocentric Life-Logging Videos
Abstract
With the increasing availability of wearable cameras, research on first-person view videos (egocentric videos) has received much attention recently. While some effort has been devoted to collecting various egocentric video datasets, there has not been a focused effort to assemble one that captures the diversity and complexity of activities related to life-logging, which is expected to be an important application for egocentric videos. In this work, we first conduct a comprehensive survey of existing egocentric video datasets. We observe that existing datasets do not emphasize activities relevant to the life-logging scenario. We build an egocentric video dataset dubbed LENA (Life-logging EgoceNtric Activities) (http://people.sutd.edu.sg/~1000892/dataset) which includes egocentric videos of 13 fine-grained activity categories, recorded under diverse situations and environments using Google Glass. Activities in LENA can also be grouped into 5 top-level categories to meet the varied needs of activity analysis research. We evaluate state-of-the-art activity recognition methods on LENA in detail and also analyze the performance of popular descriptors in egocentric activity recognition.
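As a rough illustration of the kind of baseline the abstract and keywords allude to (Fisher vectors over motion descriptors with a Gaussian mixture model and a linear classifier), the sketch below shows one such pipeline. It is not the authors' configuration: the descriptor extraction is replaced by random stand-in data, and all dimensions and parameters are assumptions chosen for illustration only.

```python
# Minimal sketch (not the authors' exact pipeline): Fisher-vector encoding of
# per-video local motion descriptors over a diagonal-covariance GMM, followed
# by a linear SVM. Descriptors here are random stand-ins for illustration.
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.svm import LinearSVC


def fisher_vector(descriptors, gmm):
    """Encode an (N x D) set of local descriptors as a 2*K*D Fisher vector."""
    X = np.atleast_2d(descriptors)
    n = X.shape[0]
    gamma = gmm.predict_proba(X)                      # soft assignments, N x K
    w, mu, var = gmm.weights_, gmm.means_, gmm.covariances_  # diag covariances
    sigma = np.sqrt(var)
    parts = []
    for k in range(gmm.n_components):
        diff = (X - mu[k]) / sigma[k]                 # whitened residuals, N x D
        g = gamma[:, k:k + 1]
        d_mu = (g * diff).sum(axis=0) / (n * np.sqrt(w[k]))
        d_var = (g * (diff ** 2 - 1)).sum(axis=0) / (n * np.sqrt(2 * w[k]))
        parts.extend([d_mu, d_var])
    fv = np.concatenate(parts)
    fv = np.sign(fv) * np.sqrt(np.abs(fv))            # power normalization
    return fv / (np.linalg.norm(fv) + 1e-12)          # L2 normalization


# Toy stand-in for per-video motion descriptors (e.g. optical-flow histograms).
rng = np.random.default_rng(0)
labels = rng.integers(0, 13, size=60)                 # 13 activity classes, as in LENA
videos = [rng.normal(loc=y, scale=1.0, size=(200, 32)) for y in labels]

gmm = GaussianMixture(n_components=16, covariance_type="diag", random_state=0)
gmm.fit(np.vstack(videos))                            # visual vocabulary over all descriptors

features = np.array([fisher_vector(v, gmm) for v in videos])
clf = LinearSVC(C=1.0).fit(features[:40], labels[:40])
print("toy hold-out accuracy:", clf.score(features[40:], labels[40:]))
```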
Year
2015
DOI
10.1007/978-3-319-16634-6_33
Venue
COMPUTER VISION - ACCV 2014 WORKSHOPS, PT III
Field
Computer vision, Activity recognition, Fisher vector, Computer science, Wearable computer, Artificial intelligence, Optical flow, Mixture model
DocType
Conference
Volume
9010
ISSN
0302-9743
Citations
8
PageRank
0.51
References
10
Authors
6
Name                    Order   Citations   PageRank
Sibo Song               1       11          0.90
Vijay Chandrasekhar     2       191         22.83
Ngai-Man Cheung         3       750         67.36
Sanath Narayan          4       8           0.51
Liyuan Li               5       48          13.24
Joo-Hwee Lim            6       783         82.45