Title: Future Localization from an Egocentric Depth Image
Abstract: This paper presents a method for future localization: predicting a set of plausible trajectories of ego-motion given a depth image. We predict paths that avoid obstacles, pass between objects, and even turn around a corner into the space behind objects. As a byproduct of the predicted ego-motion trajectories, we discover the empty space in the image that is occluded by foreground objects. The algorithm uses no image-based features such as semantic labeling/segmentation or object detection/recognition. Inspired by proxemics, we represent the space around a person with an EgoSpace map, akin to an illustrated tourist map, which measures the likelihood of occlusion in the egocentric coordinate system. A future trajectory of ego-motion is modeled as a linear combination of compact trajectory bases, which allows us to constrain the predicted trajectory. We learn the relationship between the EgoSpace map and the trajectory from the EgoMotion dataset, which provides in-situ measurements of future trajectories. A trajectory is predicted by minimizing a cost function that accounts for partial occlusion due to foreground objects; because this cost function permits the trajectory to pass through occluded space, it allows us to discover the empty space behind the foreground objects. We quantitatively evaluate the method to show its predictive validity and apply it to a variety of real-world scenes, including walking, shopping, and social interactions.
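As a concrete illustration of the trajectory-basis idea in the abstract, the sketch below represents a 2D trajectory as a linear combination of compact bases and fits the coefficients by least squares. The choice of DCT bases, the trajectory length, and the number of bases are assumptions for illustration only; the abstract does not specify the basis family or the full cost function.

```python
import numpy as np

# Assumed setup (not from the paper): T time steps, K low-frequency bases.
T, K = 50, 8

# DCT-II basis matrix B of shape (T, K): column k is the k-th cosine basis.
# Low-frequency cosine bases are one common choice of compact trajectory bases.
t = np.arange(T)
B = np.cos(np.pi * (t[:, None] + 0.5) * np.arange(K)[None, :] / T)

def fit_coefficients(X):
    """Least-squares projection of a trajectory X (T x 2) onto the basis.

    Returns C (K x 2), the per-axis basis coefficients, so that X is
    approximated by B @ C. Constraining a trajectory to lie in the span
    of a few bases keeps the prediction compact and smooth.
    """
    C, *_ = np.linalg.lstsq(B, X, rcond=None)
    return C

def reconstruct(C):
    """Reconstruct a trajectory (T x 2) from basis coefficients."""
    return B @ C

# Usage: fit a noisy curved ground-plane path, then reconstruct it smoothly.
truth = np.stack([np.linspace(0, 5, T),
                  0.2 * np.linspace(0, 5, T) ** 2], axis=1)
X_noisy = truth + 0.05 * np.random.randn(T, 2)
X_smooth = reconstruct(fit_coefficients(X_noisy))
```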
Year: 2015
Venue: CoRR
Field: Coordinate system, Linear combination, Computer vision, Object detection, Pattern recognition, Computer science, Segmentation, Proxemics, Image based, Semantic labeling, Artificial intelligence, Trajectory
DocType:
Volume: abs/1509.02094
Citations: 0
Journal:
PageRank: 0.34
References: 16
Authors: 3
Name           Order  Citations  PageRank
Hyun Soo Park  1      172        14.24
yedong niu     2      0          0.34
Jianbo Shi     3      10207      1031.66