Title
Real-time Omnidirectional Visual SLAM with Semi-Dense Mapping.
Abstract
State-of-the-art visual SLAM is moving from sparse features toward semi-dense features to provide more information for environment perception, yet semi-dense methods often suffer from inaccurate depth-map estimation and tend to become unstable in some real-world scenarios. This paper proposes to extend the ORB-SLAM2 framework, a robust sparse-feature SLAM system that tracks camera motion with map maintenance and loop closure, by introducing the unified spherical camera model and a semi-dense depth map. The unified spherical camera model fits omnidirectional cameras well, so the proposed visual SLAM system can handle fisheye cameras, which are commonly installed on modern vehicles to provide a larger perceived region. In addition to sparse corner features, the proposed system also utilizes high-gradient regions as semi-dense features, thereby providing rich environment information. The paper presents in detail how the unified spherical camera model and semi-dense feature matching are fused with the original SLAM system. Both the camera-tracking accuracy and the estimated depth map of the proposed SLAM system are evaluated on real-world data and CG-rendered data, where ground-truth depth maps are available.
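For readers unfamiliar with the unified spherical camera model mentioned in the abstract, the following C++ sketch shows the usual two-step projection: a 3D point is first mapped onto the unit sphere and then re-projected onto the image plane through a second centre offset by the mirror parameter xi. The struct name, field names, and parameter values below are illustrative assumptions, not code or calibration values from the paper.

```cpp
// Minimal sketch of the unified spherical camera model projection,
// assuming pinhole-style intrinsics (fx, fy, cx, cy) and a mirror
// parameter xi; lens distortion is omitted for brevity.
#include <cmath>
#include <cstdio>

struct UnifiedSphericalCamera {
    double fx, fy, cx, cy;  // intrinsics applied after the sphere-to-plane step
    double xi;              // offset of the second projection centre from the sphere centre

    // Project a 3D point in camera coordinates to pixel coordinates.
    // Returns false when the point cannot be projected (degenerate geometry).
    bool project(double X, double Y, double Z, double& u, double& v) const {
        const double norm = std::sqrt(X * X + Y * Y + Z * Z);
        if (norm <= 0.0) return false;
        // Step 1: project the point onto the unit sphere.
        const double xs = X / norm, ys = Y / norm, zs = Z / norm;
        // Step 2: re-project from the centre (0, 0, -xi) onto the normalised plane.
        const double denom = zs + xi;
        if (denom <= 1e-9) return false;
        // Step 3: apply the intrinsics.
        u = fx * (xs / denom) + cx;
        v = fy * (ys / denom) + cy;
        return true;
    }
};

int main() {
    UnifiedSphericalCamera cam{300.0, 300.0, 640.0, 480.0, 0.8};  // example values only
    double u, v;
    if (cam.project(1.0, 0.5, 2.0, u, v))
        std::printf("pixel: (%.2f, %.2f)\n", u, v);
    return 0;
}
```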
Year
2018
Venue
Intelligent Vehicles Symposium
Field
Omnidirectional camera, Iterative reconstruction, Computer vision, Omnidirectional antenna, Computer science, Visualization, Bundle adjustment, Ground truth, Artificial intelligence, Depth map, Simultaneous localization and mapping
DocType
Conference
Citations
0
PageRank
0.34
References
0
Authors
5
Name           Order  Citations  PageRank
Senbo Wang     1      1          2.04
Jiguang Yue    2      8          3.88
Yanchao Dong   3      135        11.28
Runjie Shen    4      1          1.37
Xinyu Zhang    5      24         12.48