Title
Exploiting Visual-Spatial First-Person Co-Occurrence for Action-Object Detection without Labels.
Abstract
Many first-person vision tasks such as activity recognition or video summarization require knowing which objects the camera wearer is interacting with (i.e., action-objects). The standard way to obtain this information is via manual annotation, which is costly and time-consuming. Moreover, whereas for third-person tasks such as object detection the annotator can be anybody, action-object detection requires the camera wearer to annotate the data, because a third person may not know what the camera wearer was thinking. This constraint makes first-person annotations even more difficult to obtain. To address this problem, we propose a Visual-Spatial Network (VSN) that detects action-objects without using any first-person labels. We do so (1) by exploiting the visual-spatial co-occurrence in first-person data and (2) by employing an alternating cross-pathway supervision between the visual and spatial pathways of our VSN. During training, we use a selected action-object prior location to initialize the pseudo action-object ground truth, which is then used to optimize both pathways in an alternating fashion. The predictions from the spatial pathway are used to update the pseudo ground truth for the visual pathway and vice versa, which allows both pathways to improve each other. We show our method's success on two different action-object datasets, where it achieves results similar to or better than those of supervised methods. We also show that our method can be successfully used as pretraining for a supervised action-object detection task.
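The alternating cross-pathway supervision described in the abstract can be illustrated with a minimal, hypothetical sketch. PyTorch, the toy pathway architectures, the binary cross-entropy loss, the 0.5 threshold, and the central-blob prior below are assumptions made for illustration only, not the authors' implementation:

# Minimal, hypothetical sketch of alternating cross-pathway supervision.
# The pathway architectures, inputs, and loss are placeholders (assumptions).
import torch
import torch.nn as nn

def make_pathway():
    # Placeholder fully convolutional pathway producing a per-pixel
    # action-object probability map (assumption: 64x64 RGB input).
    return nn.Sequential(
        nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
        nn.Conv2d(16, 1, 1), nn.Sigmoid(),
    )

visual_net, spatial_net = make_pathway(), make_pathway()
opt_v = torch.optim.SGD(visual_net.parameters(), lr=1e-2)
opt_s = torch.optim.SGD(spatial_net.parameters(), lr=1e-2)
bce = nn.BCELoss()

# Toy unlabeled first-person frames; for simplicity the same tensor stands in
# for both the visual input and the spatial input.
frames = torch.rand(8, 3, 64, 64)

# Pseudo ground truth initialized from an action-object prior location,
# here approximated by a central blob.
pseudo_gt = torch.zeros(8, 1, 64, 64)
pseudo_gt[:, :, 24:40, 24:40] = 1.0

for epoch in range(4):
    # (1) Optimize the visual pathway against the current pseudo ground truth.
    opt_v.zero_grad()
    loss_v = bce(visual_net(frames), pseudo_gt)
    loss_v.backward()
    opt_v.step()

    # (2) Its predictions become the pseudo ground truth for the spatial pathway.
    with torch.no_grad():
        pseudo_gt = (visual_net(frames) > 0.5).float()

    opt_s.zero_grad()
    loss_s = bce(spatial_net(frames), pseudo_gt)
    loss_s.backward()
    opt_s.step()

    # (3) And vice versa: spatial predictions update the pseudo ground truth
    # used by the visual pathway in the next round.
    with torch.no_grad():
        pseudo_gt = (spatial_net(frames) > 0.5).float()

The key point the sketch tries to capture is that neither pathway ever sees a human label: each is supervised only by the prior or by the other pathway's current predictions.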
Year
2016
Venue
arXiv: Computer Vision and Pattern Recognition
Field
Object detection, Computer vision, Pattern recognition, Computer science, Co-occurrence, Artificial intelligence
DocType
Journal
Volume
abs/1611.05335
Citations
0
PageRank
0.34
References
0
Authors
3
Name | Order | Citations | PageRank
Gedas Bertasius | 1 | 169 | 10.38
Yu, Stella X. | 2 | 877 | 86.36
Jianbo Shi | 3 | 10207 | 1031.66