Title
A Provably Correct Algorithm for Deep Learning that Actually Works
Abstract
We describe a layer-by-layer algorithm for training deep convolutional networks, in which each step consists of gradient updates for a two-layer network followed by a simple clustering step. Our algorithm stems from a deep generative model that generates images level by level, where lower-resolution images correspond to latent semantic classes. We analyze the convergence rate of our algorithm under the assumption that the data is indeed generated according to this model (along with additional assumptions). While we do not claim that these assumptions are realistic for natural images, we believe that they capture some true properties of real data. Furthermore, we show that our algorithm works in practice (on the CIFAR dataset), achieving results in the same ballpark as those of vanilla convolutional neural networks trained by stochastic gradient descent. Finally, our proof techniques may be of independent interest.
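The layer-by-layer procedure sketched in the abstract (per level: gradient training of a two-layer network, then a simple clustering step on its activations) can be illustrated roughly as follows. This is a minimal sketch under several assumptions of our own: the two-layer network, k-means as the clustering algorithm, squared loss, and all function names, shapes, and hyperparameters are illustrative, not the paper's exact method.

```python
# Hedged sketch of a layer-by-layer training scheme: at each level, fit a
# two-layer ReLU network by gradient descent, then run plain k-means on the
# hidden activations and pass each point's cluster centroid to the next level.
# All details below are illustrative assumptions, not the paper's algorithm.
import numpy as np

def train_two_layer(X, y, hidden=16, lr=0.1, steps=200, seed=0):
    """Gradient descent on a two-layer ReLU network with squared loss."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(scale=0.5, size=(X.shape[1], hidden))
    w2 = rng.normal(scale=0.5, size=hidden)
    for _ in range(steps):
        H = np.maximum(X @ W1, 0.0)           # hidden activations
        err = H @ w2 - y                      # squared-loss residual
        grad_w2 = H.T @ err / len(y)
        grad_H = np.outer(err, w2) * (H > 0)  # backprop through the ReLU
        W1 -= lr * (X.T @ grad_H / len(y))
        w2 -= lr * grad_w2
    return W1, w2

def kmeans(X, k=4, iters=20, seed=0):
    """Plain k-means; returns centroids and hard assignments."""
    rng = np.random.default_rng(seed)
    C = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        d2 = ((X[:, None, :] - C[None, :, :]) ** 2).sum(-1)
        assign = d2.argmin(1)
        for j in range(k):
            if (assign == j).any():
                C[j] = X[assign == j].mean(0)
    return C, assign

def layerwise_train(X, y, depth=2):
    """Stack levels: two-layer gradient step, then cluster the activations."""
    rep = X
    for _ in range(depth):
        W1, _ = train_two_layer(rep, y)
        H = np.maximum(rep @ W1, 0.0)
        C, assign = kmeans(H)
        rep = C[assign]  # next level's input: each point's cluster centroid
    return rep

rng = np.random.default_rng(1)
X = rng.normal(size=(64, 8))
y = np.sign(X[:, 0])
rep = layerwise_train(X, y)
print(rep.shape)
```

Replacing each activation with its cluster centroid is one plausible reading of "a simple clustering algorithm" acting as the hand-off between levels; the paper's generative model would dictate the actual choice.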
Year
2018
Venue
arXiv: Learning
Field
Stochastic gradient descent, Convolutional neural network, Algorithm, Rate of convergence, Artificial intelligence, Deep learning, Cluster analysis, Mathematics, Generative model
Volume
abs/1803.09522
Citations
4
PageRank
0.39
References
11
Authors
1. Eran Malach
2. Shai Shalev-Shwartz