Abstract |
---|
In this paper, we consider the problem of jointly tracking the pose and shape of objects based on noisy data from cameras and depth sensors. Our proposed approach formalizes object silhouettes extracted from image data as measurements within a Bayesian estimation framework. Projecting object silhouettes from images back into space yields a visual hull that constrains the object. In this work, we focus on the 2D case. We derive a general equation for the silhouette measurement update that explicitly accounts for the segmentation uncertainty of each pixel. By assuming a bounded error for the silhouettes, we can reduce the complexity of the general solution to consider only uncertain edges, and we derive an approximate measurement update. In simulations, we show that the proposed approach substantially outperforms point-cloud-based estimators, especially in the presence of high noise. |
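To make the idea of a silhouette measurement update with per-pixel segmentation uncertainty concrete, here is a minimal 2D sketch. It is not the paper's algorithm: the disk-shaped object, the 1D pinhole camera at the origin, the error probability `EPS`, and the particle-filter update are all illustrative assumptions. Each pixel's binary silhouette value is treated as a Bernoulli measurement that is wrong with probability `EPS`, and particles over the object state are reweighted by the resulting likelihood.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed setup (not from the paper): a pinhole camera at the origin
# imaging a circular 2D object with state (cx, cy, r).
N_PIX = 64                                # pixels on the 1D image line
ANGLES = np.linspace(-0.5, 0.5, N_PIX)    # viewing-ray angles (rad)
EPS = 0.1                                 # per-pixel segmentation error prob.

def silhouette(cx, cy, r):
    """Ideal binary silhouette: pixel is 1 if its viewing ray hits the disk."""
    # Distance from disk center (cx, cy) to the line through the origin
    # with direction (cos a, sin a) is |cx*sin(a) - cy*cos(a)|.
    d = np.abs(cx * np.sin(ANGLES) - cy * np.cos(ANGLES))
    return (d <= r).astype(float)

def log_likelihood(z, cx, cy, r):
    """Log-likelihood of a noisy silhouette z under per-pixel error EPS."""
    s = silhouette(cx, cy, r)
    p_one = s * (1 - EPS) + (1 - s) * EPS   # P(z_i = 1 | state)
    p = z * p_one + (1 - z) * (1 - p_one)   # P(z_i | state)
    return np.log(p).sum()

# Simulate a noisy silhouette measurement of a ground-truth disk.
true_state = (5.0, 0.3, 1.0)
z = silhouette(*true_state)
flips = rng.random(N_PIX) < EPS             # random segmentation errors
z[flips] = 1.0 - z[flips]

# One particle-filter measurement update on the state (cx, cy, r).
particles = np.column_stack([
    rng.uniform(3.0, 7.0, 500),             # cx
    rng.uniform(-1.0, 1.0, 500),            # cy
    rng.uniform(0.5, 1.5, 500),             # r
])
logw = np.array([log_likelihood(z, *p) for p in particles])
w = np.exp(logw - logw.max())
w /= w.sum()                                # normalized particle weights
estimate = w @ particles                    # posterior-mean state estimate
```

Note that a single silhouette leaves the range/size of the object ambiguous (a closer, smaller disk casts the same silhouette), which is exactly why fusing silhouette constraints with depth or multi-view data, as the paper proposes, is attractive.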
Year | Venue | Keywords
---|---|---
2013 | Fusion | point clouds, depth sensors, bayesian object tracking, silhouette measurement update, silhouette measurements, image segmentation, image denoising, pose estimation, silhouettes, bayesian estimation framework, visual hull, image sensors, shape and pose estimation, object tracking, cameras, noisy point clouds, object pose, pixel segmentation uncertainty, object silhouettes, point cloud-based estimators, extended object tracking, object shape, shape, noise measurement, sensors, uncertainty
Field | DocType | ISBN
---|---|---
Computer vision, Visual hull, Computer science, Silhouette, 3D pose estimation, Image segmentation, Pose, Video tracking, Artificial intelligence, Pixel, Point cloud, Machine learning | Conference | 978-605-86311-1-3
Citations | PageRank | References
---|---|---
0 | 0.34 | 14
Authors |
---|
3 |
Name | Order | Citations | PageRank |
---|---|---|---
Florian Faion | 1 | 74 | 7.95 |
Marcus Baum | 2 | 285 | 32.99 |
Uwe D. Hanebeck | 3 | 944 | 133.52 |