Title
Gaussian Mixture Models for Temporal Depth Fusion
Abstract
Sensing the 3D environment of a moving robot is essential for collision avoidance. Most 3D sensors produce dense depth maps, which are subject to imperfections due to various environmental factors. Temporal fusion of depth maps is crucial to overcome these imperfections. Such fusion is traditionally performed in 3D space using voxel data structures, but it can also be performed in image space, with potential benefits of reduced memory and computational cost for applications such as reactive collision avoidance for micro air vehicles. In this paper, we present an efficient Gaussian-mixture-model-based depth map fusion approach, introducing an online update scheme for dense representations. The environment is modeled from an ego-centric point of view, where each pixel is represented by a mixture of Gaussian inverse-depth models. Consecutive frames are related to each other by transformations obtained from visual odometry. This approach achieves better accuracy than alternative image-space depth map fusion techniques at lower computational cost.
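The per-pixel update described in the abstract can be illustrated with a short sketch: each pixel keeps a small mixture of Gaussians over inverse depth, and each new measurement either refines the closest matching component or replaces the weakest one, in the spirit of classic per-pixel mixture-of-Gaussians modeling adapted to inverse depth. The Python/NumPy code below is a minimal sketch under assumed parameters (K, MATCH_SIGMA, ALPHA) and assumed helper names (init_models, update_pixel); it is not the authors' implementation, and it omits the visual-odometry warping that relates consecutive frames.

# Minimal sketch (not the paper's implementation) of an online per-pixel
# Gaussian-mixture update over inverse depth. All constants are assumptions.
import numpy as np

K = 3              # assumed number of mixture components per pixel
MATCH_SIGMA = 2.5  # assumed matching threshold (in standard deviations)
ALPHA = 0.05       # assumed learning rate for the online update

def init_models(height, width, init_var=1.0):
    """Allocate per-pixel mixture parameters: weights, means, variances."""
    weights = np.zeros((height, width, K))
    means = np.zeros((height, width, K))            # inverse depth (1/m)
    variances = np.full((height, width, K), init_var)
    return weights, means, variances

def update_pixel(w, mu, var, z):
    """Online update of one pixel's mixture with inverse-depth measurement z."""
    # Find the closest component within MATCH_SIGMA standard deviations.
    d = np.abs(z - mu)
    matched = d < MATCH_SIGMA * np.sqrt(var)
    if matched.any():
        k = np.argmin(np.where(matched, d, np.inf))
        # Reinforce the matched component and blend it toward the measurement.
        w *= (1.0 - ALPHA)
        w[k] += ALPHA
        mu[k] = (1.0 - ALPHA) * mu[k] + ALPHA * z
        var[k] = (1.0 - ALPHA) * var[k] + ALPHA * (z - mu[k]) ** 2
    else:
        # No match: replace the weakest component with a new, uncertain one.
        k = np.argmin(w)
        w[k], mu[k], var[k] = ALPHA, z, 1.0
    w /= w.sum()
    return w, mu, var

Under these assumptions, a fused inverse-depth estimate for a pixel could be read from its highest-weight component (e.g. mu[np.argmax(w)]); in the paper, consecutive frames are additionally related by visual-odometry transformations before each update.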
Year
2017
DOI
10.1109/WACV.2017.104
Venue
2017 IEEE Winter Conference on Applications of Computer Vision (WACV 2017)
Field
Computer vision, Visual odometry, Computer science, Sensor fusion, Gaussian, Artificial intelligence, Pixel, Solid modeling, Depth map, Simultaneous localization and mapping, Mixture model
DocType
Conference
ISSN
2472-6737
Citations
2
PageRank
0.38
References
29
Authors
3
Name | Order | Citations | PageRank
Cevahir Çigla | 1 | 3 | 1.07
Roland Brockers | 2 | 77 | 9.62
Larry H. Matthies | 3 | 9587 | 9.64