Title
Globally Spatial-Temporal Perception: A Long-Term Tracking System
Abstract
Although Siamese trackers have achieved superior performance, these approaches tend to rely on a local search mechanism and are thus prone to accumulating inaccuracies in the predicted positions, leading to tracking drift over time, especially in long-term tracking scenarios. To address these problems, we propose a Siamese tracker in the spirit of Faster R-CNN's two-stage detection paradigm. This new tracker is dedicated to reducing cumulative inaccuracies and improving robustness through a global perception mechanism, which allows the target to be re-detected in time spatially over the whole image plane. Since very deep networks can be employed for feature learning in this two-stage tracking framework, strong discriminative power is ensured. Moreover, we add a CNN-based trajectory prediction module that exploits the target's temporal motion information to mitigate the interference of distractors. The spatial and temporal modules exploit both high-level appearance information and complementary trajectory information to improve tracking robustness. Comprehensive experiments demonstrate that the proposed Globally Spatial-Temporal Perception-based tracking system performs favorably against state-of-the-art trackers.
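The abstract describes fusing a global (whole-image) appearance search with a CNN-based trajectory prediction module. The paper's exact architecture is not given in this record, so the following is only a minimal illustrative sketch under assumed details: a tiny 1D-CNN that predicts the next target centre from past centres, and a Gaussian motion prior used to re-rank candidate detections from a global stage. All names here (TrajectoryCNN, fuse_scores, sigma) are hypothetical, not the authors' code.

# Illustrative sketch only; assumptions are noted in comments.
import torch
import torch.nn as nn


class TrajectoryCNN(nn.Module):
    """Tiny 1D-CNN mapping the last K target centres (x, y) to a predicted next centre.
    This is an assumed stand-in for the paper's CNN-based trajectory prediction module."""

    def __init__(self, history: int = 8):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(2, 16, kernel_size=3, padding=1),   # input: (B, 2, K) past centres
            nn.ReLU(),
            nn.Conv1d(16, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Flatten(),
            nn.Linear(16 * history, 2),                    # output: predicted (x, y)
        )

    def forward(self, centres: torch.Tensor) -> torch.Tensor:
        return self.net(centres)


def fuse_scores(appearance_scores: torch.Tensor,
                candidate_centres: torch.Tensor,
                predicted_centre: torch.Tensor,
                sigma: float = 32.0) -> torch.Tensor:
    """Re-rank global candidates: appearance score times a Gaussian motion prior
    centred on the trajectory prediction, which down-weights far-away distractors."""
    dist2 = ((candidate_centres - predicted_centre) ** 2).sum(dim=-1)   # (N,)
    motion_prior = torch.exp(-dist2 / (2.0 * sigma ** 2))
    return appearance_scores * motion_prior


if __name__ == "__main__":
    K = 8
    past_centres = torch.randn(1, 2, K) * 10 + 100        # fake tracking history
    predictor = TrajectoryCNN(history=K)
    pred_centre = predictor(past_centres)[0]               # predicted next centre, shape (2,)

    # Fake output of a global (whole-image) detection stage: candidate centres and scores.
    cand_centres = torch.rand(5, 2) * 200
    cand_scores = torch.rand(5)
    fused = fuse_scores(cand_scores, cand_centres, pred_centre)
    print("best candidate index:", int(fused.argmax()))

The design intent sketched here is that the appearance branch supplies a score for every candidate over the full image plane, while the temporal branch supplies a complementary prior, so a distractor with high appearance similarity but an implausible location is suppressed.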
Year
2020
DOI
10.1109/ICIP40778.2020.9191319
Venue
2020 IEEE International Conference on Image Processing (ICIP)
Keywords
Visual object tracking, siamese network, motion model
DocType
Conference
ISSN
1522-4880
Citations
0
PageRank
0.34
References
0
Authors
5
Name, Order, Citations, PageRank
Zhenbang Li101.35
Qiang Wang243666.63
Jin Gao328014.51
Bing Li421760.28
Weiming Hu55300261.38