Title
Support Surface Prediction in Indoor Scenes
Abstract
In this paper, we present an approach to predict the extent and height of supporting surfaces such as tables, chairs, and cabinet tops from a single RGBD image. We define support surfaces to be horizontal, planar surfaces that can physically support objects and humans. Given an RGBD image, our goal is to localize the height and full extent of such surfaces in 3D space. To achieve this, we created a labeling tool and annotated the 1449 images of the NYU dataset with rich, complete 3D scene models. We extract ground truth from the annotated dataset and develop a pipeline for predicting the floor space, the walls, and the height and full extent of support surfaces. Finally, we match the predicted extents against annotated training scenes and transfer the support surface configurations from those scenes. We evaluate the proposed approach on our dataset and demonstrate its effectiveness for understanding scenes in 3D space.
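The abstract defines support surfaces as horizontal, planar regions and localizes their heights in 3D. A minimal sketch of that idea (not the paper's actual pipeline, and with all names and thresholds chosen here for illustration) is to keep only points whose surface normal points upward, then histogram their heights to find candidate support surface levels:

```python
import numpy as np

def support_surface_heights(points, normals, up=(0.0, 0.0, 1.0),
                            min_dot=0.95, bin_size=0.05, min_count=50):
    """Toy illustration: candidate support surface heights from a point cloud.

    points  : (N, 3) array of 3D points (z is height).
    normals : (N, 3) array of unit surface normals for each point.
    Returns a list of heights (bin centers) with enough horizontal points.
    """
    up = np.asarray(up, dtype=float)
    # Keep points whose normal is nearly parallel to "up": horizontal surfaces.
    horizontal = points[normals @ up > min_dot]
    if len(horizontal) == 0:
        return []
    # Histogram the heights; well-populated bins mark candidate surfaces.
    z = horizontal[:, 2]
    edges = np.arange(z.min(), z.max() + bin_size, bin_size)
    counts, edges = np.histogram(z, bins=edges)
    centers = 0.5 * (edges[:-1] + edges[1:])
    return [float(c) for c, n in zip(centers, counts) if n >= min_count]
```

For example, a synthetic scene with a floor at z = 0, a table top at z = 0.7, and a vertical wall yields two candidate heights near 0 and 0.7; the wall points are rejected because their normals are horizontal.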
Year: 2013
DOI: 10.1109/ICCV.2013.266
Venue: ICCV
Keywords: support surface prediction, annotated scene, annotated dataset, training scene, indoor scenes, support surface configuration, support surface, full extent, RGBD image, floor space, NYU dataset
Field: Computer vision, Computer graphics (images), Computer science, Support surface, Planar, Ground truth, Artificial intelligence, Image parsing, TOPS
DocType: Conference
ISSN: 1550-5499
Citations: 28
PageRank: 1.21
References: 14
Authors: 2
Name         Order  Citations  PageRank
Ruiqi Guo    1      564        22.10
Derek Hoiem  2      4998       302.66