Title
Unsupervised Event-Based Learning Of Optical Flow, Depth, And Egomotion
Abstract
In this work, we propose a novel framework for unsupervised learning with event cameras that learns motion information from only the event stream. In particular, we propose an input representation of the events in the form of a discretized volume that maintains the temporal distribution of the events, which we pass through a neural network to predict the motion of the events. This predicted motion is used to remove motion blur from the event image. We then apply a loss function to the motion-compensated event image that measures the remaining motion blur. We train two networks with this framework, one to predict optical flow and one to predict egomotion and depth, and evaluate these networks on the Multi Vehicle Stereo Event Camera dataset, along with qualitative results from a variety of different scenes.
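The discretized event volume described in the abstract can be sketched as follows: events, given as (x, y, t, polarity) arrays, are accumulated into a fixed number of temporal bins, with each event's polarity split between its two nearest bins by linear interpolation in time so the temporal distribution is preserved. This is a minimal illustrative sketch, not the authors' implementation; the function name, argument layout, and bin count are assumptions.

```python
import numpy as np

def event_volume(xs, ys, ts, ps, H, W, B=9):
    """Accumulate events into a B x H x W volume, linearly
    weighting each event's polarity across the two nearest
    temporal bins (illustrative sketch; B=9 is an assumption)."""
    xs = np.asarray(xs, dtype=int)
    ys = np.asarray(ys, dtype=int)
    ts = np.asarray(ts, dtype=float)
    ps = np.asarray(ps, dtype=float)
    vol = np.zeros((B, H, W), dtype=np.float32)
    # Normalize timestamps to the bin axis [0, B-1].
    t = (ts - ts.min()) / max(ts.max() - ts.min(), 1e-9) * (B - 1)
    t0 = np.floor(t).astype(int)
    for dt in (0, 1):
        b = t0 + dt
        # Triangular (linear-interpolation) weight for this bin.
        w = np.clip(1.0 - np.abs(b - t), 0.0, 1.0)
        valid = b < B
        np.add.at(vol, (b[valid], ys[valid], xs[valid]), ps[valid] * w[valid])
    return vol
```

The `np.add.at` call performs an unbuffered scatter-add, so multiple events landing on the same (bin, y, x) cell accumulate correctly.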
Year
2019
DOI
10.1109/CVPR.2019.00108
Venue
2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR 2019)
Field
Computer vision, Pattern recognition, Computer science, Artificial intelligence, Optical flow
DocType
Journal
Volume
abs/1812.08156
ISSN
1063-6919
Citations
14
PageRank
0.49
References
0
Authors
4
Name	Order	Citations	PageRank
Alex Zihao Zhu	1	54	4.75
Liangzhe Yuan	2	19	1.96
Kenneth P. Chaney	3	16	2.25
Konstantinos Daniilidis	4	3122	255.45