Title
Which Neural Net Architectures Give Rise To Exploding and Vanishing Gradients?
Abstract
We give a rigorous analysis of the statistical behavior of gradients in a randomly initialized fully connected network N with ReLU activations. Our results show that the empirical variance of the squares of the entries in the input-output Jacobian of N is exponential in a simple architecture-dependent constant beta, given by the sum of the reciprocals of the hidden layer widths. When beta is large, the gradients computed by N at initialization vary wildly. Our approach complements the mean field theory analysis of random networks. From this point of view, we rigorously compute finite width corrections to the statistics of gradients at the edge of chaos.
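The central quantity in the abstract is the architecture-dependent constant beta, the sum of the reciprocals of the hidden layer widths. As a rough illustration (not an experiment from the paper), the minimal NumPy sketch below estimates the fluctuations of a single input-output Jacobian entry over many random initializations of a fully connected ReLU network; the He-style variance-2/fan_in weights, the depth-5 wide vs. narrow architectures, the input dimension, and the 2000-sample estimate are all illustrative assumptions.

```python
import numpy as np

def beta(widths):
    """Architecture constant from the abstract: sum of reciprocal hidden widths."""
    return sum(1.0 / n for n in widths)

def jacobian_entry(widths, d_in=10, d_out=1, rng=None):
    """Return one entry of the input-output Jacobian of a freshly initialized
    fully connected ReLU network, evaluated at a random input.

    Hidden layers use ReLU; the output layer is linear.  Weights are drawn
    i.i.d. with variance 2/fan_in (He-style) so the mean squared Jacobian
    entry stays O(1) and only its fluctuations depend on the architecture.
    """
    rng = np.random.default_rng() if rng is None else rng
    dims = [d_in] + list(widths) + [d_out]
    h = rng.standard_normal(d_in)          # random input point
    J = np.eye(d_in)                       # running Jacobian, shape (layer width, d_in)
    layers = list(zip(dims[:-1], dims[1:]))
    for i, (fan_in, fan_out) in enumerate(layers):
        W = rng.standard_normal((fan_out, fan_in)) * np.sqrt(2.0 / fan_in)
        z = W @ h
        J = W @ J
        if i < len(layers) - 1:            # ReLU and its 0/1 derivative mask
            mask = (z > 0).astype(float)
            h = z * mask
            J = mask[:, None] * J
        else:                              # linear output layer
            h = z
    return J[0, 0]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Two architectures of equal depth: wide (small beta) vs. narrow (large beta).
    for widths in ([100] * 5, [10] * 5):
        samples = np.array([jacobian_entry(widths, rng=rng) for _ in range(2000)])
        m2, m4 = np.mean(samples ** 2), np.mean(samples ** 4)
        print(f"widths {widths[0]}x{len(widths)}: beta={beta(widths):.2f}, "
              f"E[J^2]={m2:.2f}, E[J^4]/E[J^2]^2={m4 / m2 ** 2:.1f}")
```

The variance-2/fan_in initialization is the critical ("edge of chaos") choice for ReLU, which keeps the mean squared Jacobian entry of order one, so any architecture-dependent blow-up shows up in the printed normalized fourth moment; that ratio should come out noticeably larger for the narrow network (beta = 0.5) than for the wide one (beta = 0.05), in line with the abstract's claim.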
Year
2018
Venue
Advances in Neural Information Processing Systems 31 (NIPS 2018)
Keywords
edge of chaos, point of view, subjective logic
Volume
31
ISSN
1049-5258
Citations
6
PageRank
0.45
References
15
Authors
1
Name
Boris Hanin
Order
1
Citations
47
PageRank
4.04