Title
An adaptive localization system for image storage and localization latency requirements.
Abstract
Fast and efficient global localization is a critical problem for autonomous systems. Existing sequence-based visual place recognition requires a storage-intensive image database for robust localization, while more storage-efficient odometry-based place recognition approaches can require a long travel distance to obtain an accurate localization. In this paper, we present a novel particle filter-based localization system that adapts to varying map image densities, road layout ambiguity and visual appearance change. The base system combines a geometric place recognition capability utilizing odometry and roadmaps with a visual place recognition system. When using a sparse image database, particles can lie at visually unknown places, which makes sequential visual place recognition difficult. To address this challenge, we propose to make use of effective visual observations to enable the system to accumulate visual belief sequentially, even when reference images are very sparse. Furthermore, we develop a vision reliability estimation method, which analyses the relationship between the visual component and particle filter convergence, to calibrate the optimal contribution of vision to particle weighting in different visual environments and conditions. To evaluate our approach, we perform extensive experiments on four benchmark localization datasets, controlling the reference image density by subsampling these datasets. Results show that the proposed technique consistently and correctly localizes the vehicle over a range of reference image densities, and consistently outperforms a particle filter-enhanced version of an existing state-of-the-art SeqSLAM system, which fails when image spacing exceeds 30 m. In particular, for a 600% increase in database image sparsity (from 10 m to 70 m image spacing), the proposed method maintains localization performance with only a 40% increase in localization latency (from 250 m to 350 m). We also provide an analysis of the results and a characterization of the system’s computational requirements.
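The abstract describes a particle filter that propagates particles with odometry and weights them by visual place recognition, gating out particles far from any reference image ("effective" visual observations) and scaling the visual contribution by an estimated vision reliability. The sketch below is a minimal illustration of that general scheme, not the authors' implementation: the 1-D route, the descriptor model, the gating distance sigma_pos, the fixed vision_reliability blend, and all function names are simplifying assumptions introduced here.

```python
# Minimal sketch: particle filter fusing odometry with a visual place-recognition
# likelihood on a 1-D route. Illustrative only; parameters are assumed, not from the paper.
import numpy as np

rng = np.random.default_rng(0)

def motion_update(particles, odom_delta, noise_std=0.5):
    """Propagate along-route particle positions by the measured odometry step plus noise."""
    return particles + odom_delta + rng.normal(0.0, noise_std, size=particles.shape)

def visual_likelihood(particles, ref_positions, ref_descriptors, query_descriptor,
                      sigma_pos=5.0):
    """Score each particle by image similarity to its nearest reference image.

    Particles far from every reference image (a sparse database) receive the mean
    score, i.e. a neutral value, so only nearby ("effective") visual observations
    shape the belief.
    """
    d = np.abs(particles[:, None] - ref_positions[None, :])   # particle-to-reference distances
    nearest = np.argmin(d, axis=1)
    near_enough = d[np.arange(len(particles)), nearest] < sigma_pos
    ref = ref_descriptors[nearest]
    # Cosine similarity between query descriptor and nearest reference descriptor.
    sim = (ref @ query_descriptor) / (
        np.linalg.norm(ref, axis=1) * np.linalg.norm(query_descriptor) + 1e-9)
    sim = np.clip(sim, 0.0, None)
    return np.where(near_enough, sim, sim.mean())

def weight_and_resample(particles, scores, vision_reliability=0.7):
    """Blend visual evidence with a uniform term according to an assumed vision
    reliability, then perform systematic resampling."""
    w = vision_reliability * scores + (1.0 - vision_reliability)
    w = w / w.sum()
    n = len(particles)
    positions = (np.arange(n) + rng.random()) / n
    idx = np.minimum(np.searchsorted(np.cumsum(w), positions), n - 1)
    return particles[idx]

# Toy usage: 1000 particles on a 1000 m route with reference images every 70 m.
particles = rng.uniform(0.0, 1000.0, size=1000)
ref_positions = np.arange(0.0, 1000.0, 70.0)
ref_descriptors = rng.normal(size=(len(ref_positions), 64))
true_position = 350.0

for step in range(20):
    particles = motion_update(particles, odom_delta=10.0)
    true_position += 10.0
    # Simulated query image descriptor: noisy copy of the reference nearest the true pose.
    nearest_ref = np.argmin(np.abs(ref_positions - true_position))
    query = ref_descriptors[nearest_ref] + rng.normal(scale=0.3, size=64)
    scores = visual_likelihood(particles, ref_positions, ref_descriptors, query)
    particles = weight_and_resample(particles, scores)

print(f"estimated position: {particles.mean():.1f} m (true: {true_position:.1f} m)")
```

In this toy setup, lowering vision_reliability makes the weights flatter, so the filter relies more on odometry and converges more slowly; the paper's contribution is to estimate that reliability from the relationship between the visual scores and filter convergence rather than fixing it by hand.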
Year: 2018
DOI: 10.1016/j.robot.2018.06.007
Venue: Robotics and Autonomous Systems
Keywords: Visual place recognition, Map-based localization, Odometry, Road map, Autonomous vehicles
Field: Convergence (routing), Computer vision, Weighting, Latency (engineering), Computer science, Particle filter, Odometry, Sparse image, Artificial intelligence, Autonomous system (Internet), Visual appearance
DocType: Journal
Volume: 107
ISSN: 0921-8890
Citations: 0
PageRank: 0.34
References: 14
Authors: 3
Name, Order, Citations, PageRank
Jun Mao, 1, 11, 2.56
Xiaoping Hu, 2, 408, 59.63
Michael Milford, 3, 1221, 84.09