Title
Deep-Mapnets: A Residual Network for 3D Environment Representation
Abstract
The ability to localize in the coordinate system of a 3D model presents an opportunity for safe trajectory planning. While SLAM-based approaches estimate incremental poses with respect to the first camera frame, they do not provide global localization. Exploiting the availability of mobile GPUs such as the Nvidia TX1, our method offers a novel, elegant, and high-performance visual approach to model-based robot localization. We propose to learn an environment representation with deep residual networks for localization in a known 3D model covering a real-world area of 25,000 sq. meters. We use the power of modern GPUs and game engines to render training images from a photo-realistic 3D model, mimicking a downward-looking, high-flying drone. These images drive the learning loop of a 50-layer deep neural network that learns camera positions. We further propose data augmentation to accelerate training and to make the trained model robust for cross-domain generalization, which we verify experimentally. We evaluate the trained model on synthetically generated data as well as on real data captured from a downward-looking drone. Predicting a camera pose takes about 25 milliseconds of GPU processing. Unlike previous methods, the proposed method does not render at test time and makes independent predictions from a learned environment representation.
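The pipeline described in the abstract (a 50-layer residual network regressing camera position from rendered aerial views) can be sketched as follows. This is a minimal illustrative sketch in PyTorch, not the authors' implementation: the class name PoseRegressor, the 3-DoF position output, the MSE loss, and the optimizer settings are assumptions for illustration only.

```python
import torch
import torch.nn as nn
from torchvision import models

class PoseRegressor(nn.Module):
    """ResNet-50 backbone whose classification head is replaced by a small
    regression head that predicts the camera position (x, y, z).
    Hypothetical sketch; not the paper's exact architecture or head."""
    def __init__(self, out_dim=3):
        super().__init__()
        backbone = models.resnet50(weights=None)   # 50-layer residual network
        in_features = backbone.fc.in_features
        backbone.fc = nn.Identity()                # drop the ImageNet classifier
        self.backbone = backbone
        self.head = nn.Linear(in_features, out_dim)

    def forward(self, x):
        return self.head(self.backbone(x))

# Illustrative training step on rendered views with (image, position) labels.
model = PoseRegressor()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.MSELoss()

images = torch.randn(8, 3, 224, 224)   # placeholder batch of rendered images
positions = torch.randn(8, 3)          # placeholder ground-truth camera positions

pred = model(images)
loss = criterion(pred, positions)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```

The data augmentation proposed in the paper would be applied to the rendered training images before such a step; its exact form is not reproduced here.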
Year: 2017
Venue: 2017 24th IEEE International Conference on Image Processing (ICIP)
Keywords: 3D model-based localization, residual network learning, rendered training data, pose regression
Field: Computer vision, Residual, Data modeling, Global localization, Computer science, Artificial intelligence, Solid modeling, Drone, Rendering (computer graphics), Artificial neural network, Trajectory planning
DocType: Conference
ISSN: 1522-4880
Citations: 0
PageRank: 0.34
References: 0
Authors: 3

Name                   Order   Citations   PageRank
Manohar Kuse           1       1           22.57
Sunil Prasad Jaiswal   2       4           27.94
Shaojie Shen           3       720         54.75