Title
Saliency selection for robust visual tracking
Abstract
This paper proposes a robust visual tracking approach based on saliency selection. In this method, salient patches and their spatial context inside the object region are exploited for object representation and appearance modeling. Tracking is then carried out by a hybrid stochastic and deterministic mechanism, which requires only a small number of samples for particle filtering and escapes the local minima of conventional deterministic tracking. As time progresses, the selected salient patches and their spatial context are updated online to adapt the appearance model to changes in both the object and its environment. We carry out experiments on several challenging sequences and compare our method with state-of-the-art algorithms to demonstrate its improved tracking performance.
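The stochastic part of the tracker described in the abstract is a particle filter. As a rough sketch only, the following is a generic 1-D bootstrap particle filter with a toy Gaussian observation likelihood; it is not the paper's saliency-based appearance model or its hybrid deterministic refinement, and all names and parameters here are illustrative:

```python
import math
import random

def particle_filter_track(observations, n_particles=100,
                          motion_std=1.0, obs_std=1.0, seed=0):
    """Generic bootstrap particle filter for a 1-D state.

    Toy model for illustration: Gaussian random-walk motion and a
    Gaussian likelihood around each scalar observation.
    """
    rng = random.Random(seed)
    # Initialize particles around the first observation.
    particles = [rng.gauss(observations[0], obs_std) for _ in range(n_particles)]
    estimates = []
    for z in observations:
        # Predict: stochastic diffusion of each particle (motion model).
        particles = [p + rng.gauss(0.0, motion_std) for p in particles]
        # Update: weight particles by the observation likelihood.
        weights = [math.exp(-0.5 * ((z - p) / obs_std) ** 2) for p in particles]
        total = sum(weights) or 1.0  # guard against all-zero weights
        weights = [w / total for w in weights]
        # Estimate: posterior mean as the weighted average of particles.
        estimates.append(sum(w * p for w, p in zip(weights, particles)))
        # Resample: multinomial resampling proportional to the weights.
        particles = rng.choices(particles, weights=weights, k=n_particles)
    return estimates
```

The appeal of the paper's hybrid scheme is that a deterministic refinement step around each particle lets this loop run with far fewer particles than a purely stochastic filter would need.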
Year
2010
DOI
10.1109/ICIP.2010.5651016
Venue
ICIP
Keywords
particle filtering (numerical methods), image representation, conventional deterministic tracking, salient patches, adaptive appearance modeling, particle filtering, saliency selection, object region, robust visual tracking, object tracking, appearance modeling, hybrid of stochastic and deterministic tracking, object representation, stochastic processes, visualization, spatial context, histograms, particle filter, environmental change, computational modeling, visual tracking
Field
Computer vision, Histogram, Pattern recognition, Salience (neuroscience), Computer science, Particle filter, Active appearance model, Video tracking, Eye tracking, Artificial intelligence, Spatial contextual awareness, Salient
DocType
Conference
ISSN
1522-4880
ISBN
978-1-4244-7993-1
Citations
2
PageRank
0.37
References
6
Authors
3
Name        Order  Citations  PageRank
Qing Wang   1      239        9.21
Feng Chen   2      431        33.92
Wenli Xu    3      1327       63.69