Title
Robust RGB-D visual odometry based on edges and points.
Abstract
Localization in unknown environments is a fundamental requirement for robots. Egomotion estimation from visual information is an active research topic. However, most visual odometry (VO) and visual Simultaneous Localization and Mapping (vSLAM) approaches assume static environments. To achieve robust and precise localization in dynamic environments, we propose a novel VO for RGB-D cameras based on edges and points. In contrast to dense motion segmentation, sparse edge alignment with distance transform (DT) errors is adopted to detect the state (static or dynamic) of image areas. Features in dynamic areas are ignored when estimating egomotion with reprojection errors, while static weights computed from DT errors are incorporated into pose estimation. Furthermore, local bundle adjustment is employed to improve the consistency of the local map and the camera localization. The proposed approach runs in real time. Experiments are conducted on the challenging sequences of the TUM RGB-D dataset. The results demonstrate that the proposed robust VO achieves more accurate and more stable localization than state-of-the-art robust VO and SLAM approaches in dynamic environments.
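To make the abstract's core idea concrete, the sketch below illustrates distance-transform-based static weighting: the DT of an edge map gives the distance from every pixel to the nearest edge, so the DT value at a reprojected point acts as an alignment residual, and a robust weight derived from it down-weights likely-dynamic points. This is a hypothetical illustration, not the authors' implementation; the brute-force DT, the Cauchy-style weight, and the scale `sigma` are all assumptions.

```python
import numpy as np

def distance_transform(edge_map):
    """Brute-force Euclidean distance transform: distance from each pixel
    to the nearest edge pixel. Fine for illustration; real-time systems
    use a linear-time algorithm instead."""
    ys, xs = np.nonzero(edge_map)
    edges = np.stack([ys, xs], axis=1).astype(float)          # (K, 2)
    h, w = edge_map.shape
    grid = np.stack(np.meshgrid(np.arange(h), np.arange(w), indexing="ij"),
                    axis=-1).reshape(-1, 1, 2).astype(float)  # (h*w, 1, 2)
    d = np.linalg.norm(grid - edges[None, :, :], axis=2).min(axis=1)
    return d.reshape(h, w)

def static_weights(dt, points, sigma=1.0):
    """Cauchy-style weight from the DT residual at each reprojected point:
    small DT error -> weight near 1 (likely static area), large DT error
    -> weight near 0 (likely dynamic area). sigma is an assumed scale."""
    r = dt[points[:, 0], points[:, 1]]
    return 1.0 / (1.0 + (r / sigma) ** 2)

# Tiny example: a horizontal edge at row 4 of an 8x8 image.
edge_map = np.zeros((8, 8), dtype=bool)
edge_map[4, :] = True
dt = distance_transform(edge_map)
pts = np.array([[4, 2], [0, 5]])   # one point on the edge, one far away
w = static_weights(dt, pts, sigma=1.0)
# The on-edge point gets full weight; the distant point is down-weighted.
```

In a VO pipeline, such weights would scale the per-feature reprojection residuals during pose optimization, so observations falling in poorly aligned (likely dynamic) regions contribute little to the estimate.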
Year
2018
DOI
10.1016/j.robot.2018.06.009
Venue
Robotics and Autonomous Systems
Keywords
Localization, Visual odometry, Dynamic environments, Edge alignment, Bundle adjustment
Field
Computer vision, Visual odometry, Segmentation, Computer science, Bundle adjustment, Pose, Distance transform, RGB color model, Artificial intelligence, Simultaneous localization and mapping, Robot
DocType
Journal
Volume
107
ISSN
0921-8890
Citations
0
PageRank
0.34
References
12
Authors
5
Name            Order  Citations  PageRank
Erliang Yao     1      0          0.34
Hexin Zhang     2      3          0.71
Hui Xu          3      212        29.73
Haitao Song     4      0          2.03
Guoliang Zhang  5      1          3.05