Title
Motion-based Object Segmentation based on Dense RGB-D Scene Flow
Abstract
Given two consecutive RGB-D images, we propose a model that estimates a dense three-dimensional (3D) motion field, also known as scene flow. We take advantage of the fact that in robot manipulation scenarios, scenes often consist of a set of rigidly moving objects. Our model jointly estimates three quantities: first, the segmentation of the scene into an unknown but finite number of objects; second, the motion trajectories of these objects; and finally, the object scene flow. We employ an hourglass deep neural network architecture. In the encoding stage, the RGB and depth images undergo spatial compression and correlation. In the decoding stage, the model outputs three images containing a per-pixel estimate of the corresponding object center as well as the object translation and rotation. This forms the basis for inferring the object segmentation and the final object scene flow. To evaluate our model, we generated a new and challenging large-scale synthetic dataset that is specifically targeted at robotic manipulation: it contains a large number of scenes with a very diverse set of simultaneously moving 3D objects and is recorded with a simulated, static RGB-D camera. In quantitative experiments, we show that we outperform state-of-the-art scene flow and motion-segmentation methods on this dataset. In qualitative experiments, we show how our learned model transfers to challenging real-world scenes, producing visually better results than existing methods.
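As a rough illustration of the decoding stage described above, the sketch below shows how a dense rigid scene flow field can be assembled from per-pixel predictions of an object center, rotation, and translation. This is a minimal NumPy sketch under assumptions not stated in the abstract (a pinhole camera back-projection and a rotation-matrix parameterization); the function names backproject and rigid_scene_flow are hypothetical and do not come from the paper.

```python
# Minimal sketch: dense rigid scene flow from per-pixel object center,
# rotation, and translation predictions. Assumes a pinhole camera model;
# the paper's exact parameterization may differ.
import numpy as np

def backproject(depth, fx, fy, cx, cy):
    """Back-project a depth image (H, W) into a 3D point map (H, W, 3)."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.stack([x, y, depth], axis=-1)

def rigid_scene_flow(points, centers, rotations, translations):
    """Per-pixel 3D scene flow under a rigid-body motion hypothesis.

    points:       (H, W, 3)    back-projected 3D points
    centers:      (H, W, 3)    predicted object center per pixel
    rotations:    (H, W, 3, 3) predicted rotation matrix per pixel
    translations: (H, W, 3)    predicted translation per pixel
    """
    rel = points - centers                        # point relative to its object center
    rotated = np.einsum('hwij,hwj->hwi', rotations, rel)
    moved = rotated + centers + translations      # rigidly transformed point
    return moved - points                         # dense 3D scene flow
```

Pixels whose predicted centers and motions agree can then be grouped to obtain the motion-based object segmentation, as the abstract indicates the per-pixel estimates form the basis for both the segmentation and the final object scene flow.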
Year
2018
DOI
10.1109/lra.2018.2856525
Venue
International Conference on Robotics and Automation
DocType
Volume
abs/1804.05195
Issue
4
Journal
Citations
2
PageRank
0.37
References
16
Authors
4
Name                          Order  Citations  PageRank
Lin Shao                      1      12         3.33
Parth B. Shah                 2      2          0.37
Vikranth Reddy Dwaracherla    3      6          2.48
Jeannette Bohg                4      275        30.60