Title
Self-calibrating active depth perception via motion parallax
Abstract
A hallmark of biological systems is their ability to self-calibrate sensory-motor loops during their development. Understanding the principles of such self-calibration will enable the design of robots with similar autonomous learning abilities. Here we consider the problem of active depth perception based on motion parallax. When an observer moves sideways while looking at an object with a single eye, the eye rotation necessary to keep the object at the center of gaze provides information about the object's distance. Based on the recently proposed active efficient coding (AEC) approach, we present a self-calibrating system which autonomously learns to represent image motion and perform compensatory eye rotations to keep the object fixated during side-to-side movements — thereby learning to actively estimate the object's distance. A neural network is used to provide a calibrated depth estimate. We evaluate the system's performance in simulation and in a hardware implementation.
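For intuition, the geometric cue the system learns to exploit can be sketched in a few lines: if the observer translates sideways while a compensatory eye rotation keeps the object fixated, the distance follows from simple triangulation. The snippet below is a minimal illustration of that relationship only, not the paper's learned neural implementation; the function name and the example numbers are hypothetical.

import numpy as np

def depth_from_parallax(lateral_shift, eye_rotation):
    """Estimate object distance from a compensatory eye rotation.

    If the observer moves sideways by `lateral_shift` (meters) and the eye
    rotates by `eye_rotation` (radians) to keep the object at the center of
    gaze, triangulation gives the distance as
        d = lateral_shift / tan(eye_rotation).
    """
    return lateral_shift / np.tan(eye_rotation)

# Example: a 5 cm sideways movement requiring a 2.86-degree compensatory
# rotation corresponds to an object roughly 1 m away.
print(depth_from_parallax(0.05, np.deg2rad(2.86)))  # ~1.0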
Year
2016
DOI
10.1109/DEVLRN.2016.7846798
Venue
2016 Joint IEEE International Conference on Development and Learning and Epigenetic Robotics (ICDL-EpiRob)
Keywords
Autonomous Learning, Motion Parallax, Active Depth Perception, Self-Calibration, Active Efficient Coding
Field
Computer vision, Parallax, Gaze, Computer science, Coding (social sciences), Artificial intelligence, Depth perception, Observer (quantum physics), Artificial neural network, Robot, Encoding (memory)
DocType
Conference
ISBN
978-1-5090-5070-3
Citations
0
PageRank
0.34
References
8
Authors
5
Name                  Order  Citations  PageRank
Tanapol Prucksakorn   1      0          0.68
Sungmoon Jeong        2      99         15.05
Jochen Triesch        3      690        73.73
Hosun Lee             4      10         4.66
Nak Young Chong       5      403        56.29