Title
Exploiting spatial-temporal context for trajectory based action video retrieval.
Abstract
Retrieving videos that contain similar actions is an important task with many applications, yet it is very challenging due to the large variations across different videos. While state-of-the-art approaches generally utilize the bag-of-visual-words representation with dense trajectory features, the spatial-temporal context among trajectories is overlooked. In this paper, we propose to incorporate such information into the descriptor coding and trajectory matching stages of the retrieval pipeline. Specifically, to capture the spatial-temporal correlations among trajectories, we develop a descriptor coding method based on the correlation between the spatial-temporal and feature aspects of individual trajectories. To deal with misalignments between dense trajectory segments, we develop an offset-aware distance measure for improved trajectory matching. Our comprehensive experimental results on two popular datasets indicate that the proposed method improves the performance of action video retrieval, especially on more dynamic actions with significant movements and cluttered backgrounds.
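Note: the record does not include the details of the offset-aware distance measure mentioned in the abstract. As a rough, hypothetical sketch only (not the authors' formulation), the Python snippet below illustrates one generic way such a measure could behave: two fixed-length dense-trajectory segments are compared under several small temporal offsets, and the best penalized alignment is kept, so segments that describe the same motion but start a few frames apart can still match. The function name offset_aware_distance and the max_offset and penalty parameters are illustrative assumptions.

```python
import numpy as np

def offset_aware_distance(traj_a, traj_b, max_offset=3, penalty=0.1):
    """Illustrative (assumed, not the paper's) distance between two
    trajectory point sequences of shape (L, 2).

    Rather than comparing the segments strictly frame-by-frame, small
    temporal offsets are also tried and the best penalized alignment
    is kept, which tolerates misaligned dense-trajectory segments.
    """
    traj_a = np.asarray(traj_a, dtype=float)
    traj_b = np.asarray(traj_b, dtype=float)
    best = np.inf
    for off in range(-max_offset, max_offset + 1):
        if off >= 0:
            a, b = traj_a[off:], traj_b[:traj_b.shape[0] - off]
        else:
            a, b = traj_a[:off], traj_b[-off:]
        n = min(len(a), len(b))
        if n == 0:
            continue
        # mean point-wise Euclidean distance over the overlapping frames,
        # plus a small penalty that grows with the offset magnitude
        d = np.mean(np.linalg.norm(a[:n] - b[:n], axis=1)) + penalty * abs(off)
        best = min(best, d)
    return best

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    long_track = np.cumsum(rng.normal(size=(20, 2)), axis=0)  # synthetic motion path
    seg_a = long_track[:15]       # frames 0-14 of the motion
    seg_b = long_track[2:17]      # frames 2-16: same motion, offset by two frames
    print(offset_aware_distance(seg_a, seg_a))  # 0.0
    print(offset_aware_distance(seg_a, seg_b))  # small: best alignment found at offset 2
```

Searching over offsets instead of comparing frame-by-frame is only one plausible design choice for handling misaligned segments; the paper's actual measure may differ.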
Year
2018
DOI
10.1007/s11042-017-4353-2
Venue
Multimedia Tools Appl.
Keywords
Spatial-temporal information, Descriptor coding, Trajectory matching, Bag-of-visual-words, Action video retrieval
Field
Computer vision, Bag-of-words model in computer vision, Pattern recognition, Video retrieval, Computer science, Coding (social sciences), Correlation, Artificial intelligence, Temporal context, Trajectory
DocType
Journal
Volume
77
Issue
2
ISSN
1573-7721
Citations
1
PageRank
0.34
References
49
Authors
6
Name               Order   Citations   PageRank
Lelin Zhang        1       23          3.51
Zhiyong Wang       2       550         51.76
Tingting Yao       3       2           1.03
Shin'ichi Satoh    4       1           0.34
Tao Mei            5       4702        288.54
David Dagan Feng   6       3329        413.76