Title
Towards consistent reconstructions of indoor spaces based on 6D RGB-D odometry and KinectFusion
Abstract
We focus on generating consistent reconstructions of indoor spaces from a freely moving handheld RGB-D sensor, with the aim of creating virtual models that can be used for measuring and remodeling. We propose a novel 6D RGB-D odometry approach that finds the relative camera pose between consecutive RGB-D frames by keypoint extraction and feature matching on both the RGB and depth image planes. Furthermore, we feed the estimated pose to the highly accurate KinectFusion algorithm, which uses a fast ICP (Iterative Closest Point) algorithm to fine-tune the frame-to-frame relative pose and fuse the depth data into a global implicit surface. We evaluate our method on a publicly available RGB-D SLAM benchmark dataset by Sturm et al. The experimental results show that our proposed reconstruction method, based solely on visual odometry and KinectFusion, outperforms the state-of-the-art RGB-D SLAM system in accuracy. Our algorithm outputs a ready-to-use polygon mesh (highly suitable for creating 3D virtual worlds) without any post-processing steps.
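The core of the frame-to-frame pose step described above is estimating a rigid transform from matched 3D keypoints back-projected from consecutive RGB-D frames. As a minimal illustrative sketch (not the paper's actual implementation), this can be done with the classic Kabsch/Umeyama least-squares alignment; the helper name `rigid_transform_3d` is hypothetical:

```python
import numpy as np

def rigid_transform_3d(src, dst):
    """Least-squares rigid transform mapping src points onto dst points.

    src, dst: (N, 3) arrays of matched 3D keypoints from two consecutive
    RGB-D frames. Returns R (3x3 rotation) and t (3,) translation such
    that dst ~= src @ R.T + t (Kabsch/Umeyama, no scale estimation).
    """
    src_mean = src.mean(axis=0)
    dst_mean = dst.mean(axis=0)
    # Cross-covariance of the centered point sets
    H = (src - src_mean).T @ (dst - dst_mean)
    U, _, Vt = np.linalg.svd(H)
    # Correct for a possible reflection in the SVD solution
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_mean - R @ src_mean
    return R, t
```

In a full pipeline such as the one the abstract outlines, this closed-form estimate (typically wrapped in RANSAC to reject bad feature matches) would serve as the initial pose that an ICP stage like KinectFusion's then refines.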
Year: 2014
DOI: 10.1109/IROS.2014.6942798
Venue: IROS
Keywords: feature extraction, image colour analysis, image matching, image reconstruction, image sensors, pose estimation, 3D virtual worlds, KinectFusion algorithm, RGB image planes, RGB-D odometry approach, depth image planes, fast ICP algorithm, feature matching, frame-to-frame relative pose, handheld RGB-D sensor, indoor space reconstruction, iterative-closest-point algorithm, keypoint extraction, ready-to-use polygon mesh, red-green-blue-depth odometry, virtual model
DocType: Conference
ISSN: 2153-0858
Citations: 0
PageRank: 0.34
References: 0
Authors: 3
Name (Order), Citations/PageRank:
William Haiwei Dong (1), 314.26
Nadia Figueroa (2), 488.64
Abdulmotaleb El-Saddik (3), 2416248.48