Title
Size-Independent Sample Complexity Of Neural Networks
Abstract
We study the sample complexity of learning neural networks by providing new bounds on their Rademacher complexity, assuming norm constraints on the parameter matrix of each layer. Compared to previous work, these complexity bounds have improved dependence on the network depth and, under some additional assumptions, are fully independent of the network size (both depth and width). These results are derived using some novel techniques, which may be of independent interest.
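As a rough illustration of the quantity the abstract bounds (not the paper's actual proof technique), the sketch below estimates the empirical Rademacher complexity of a norm-constrained *linear* predictor class by Monte Carlo. The function name, the norm bound `B`, and the closed-form supremum used for the linear case are assumptions for this toy example, not constructs from the paper.

```python
import numpy as np

def empirical_rademacher(X, B=1.0, n_draws=1000, seed=0):
    """Monte Carlo estimate of the empirical Rademacher complexity of
    {x -> <w, x> : ||w||_2 <= B} on the sample X (shape n x d).

    For this linear class the inner supremum has a closed form:
        sup_{||w||<=B} (1/n) sum_i s_i <w, x_i> = (B/n) * ||sum_i s_i x_i||_2,
    so each draw of Rademacher signs s contributes one exact supremum value.
    """
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    vals = []
    for _ in range(n_draws):
        s = rng.choice([-1.0, 1.0], size=n)      # i.i.d. Rademacher signs
        vals.append(B / n * np.linalg.norm(s @ X))
    return float(np.mean(vals))
```

Note that the estimate scales linearly in the norm bound `B` and decays roughly as 1/sqrt(n) in the sample size, which is the kind of size-independent behavior (no dependence on width or depth) the paper establishes for neural network classes under layer-wise norm constraints.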
Year
2018
DOI
10.1093/imaiai/iaz007
Venue
INFORMATION AND INFERENCE-A JOURNAL OF THE IMA
Keywords
neural networks, deep learning, sample complexity, Rademacher complexity
DocType
Conference
Volume
9
Issue
2
ISSN
2049-8764
Citations
26
PageRank
0.82
References
3
Authors
3
Name               Order  Citations  PageRank
Noah Golowich      1      39         2.69
Alexander Rakhlin  2      7326       2.84
Ohad Shamir        3      16271      19.03