Abstract
---
In this paper we study the learnability of random deep networks both theoretically and experimentally. On the theoretical front, assuming the statistical query model, we show that the learnability of random deep networks with sign activation drops exponentially with depth; under plausible conjectures, our results extend to ReLU and sigmoid activations. The core of the argument is that even for highly correlated inputs, the outputs of deep random networks are near-orthogonal. On the experimental side, we find that the learnability of random networks drops sharply with depth even with state-of-the-art training methods.
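The near-orthogonality mechanism named in the abstract can be checked numerically. Below is a minimal NumPy sketch, not taken from the paper, that feeds a pair of highly correlated unit vectors through a random network with sign activations and tracks their correlation layer by layer; the width, depth, trial count, and initial correlation (rho = 0.9) are arbitrary choices for illustration. For Gaussian weights, one layer sends a correlation ρ to roughly (2/π) arcsin(ρ), which is strictly smaller in magnitude than ρ for ρ in (0, 1), so the representations drift toward orthogonality as depth grows.

```python
# Sketch (assumed setup, not the paper's code): correlation decay under
# random sign-activation layers.
import numpy as np

rng = np.random.default_rng(0)
dim, depth, trials = 1000, 10, 20

def correlated_pair(rho):
    """Two unit vectors in R^dim whose inner product is close to rho."""
    x = rng.standard_normal(dim)
    y = rho * x + np.sqrt(1.0 - rho**2) * rng.standard_normal(dim)
    return x / np.linalg.norm(x), y / np.linalg.norm(y)

def layer_correlations(rho):
    """Correlation of the two representations after each random sign layer."""
    x, y = correlated_pair(rho)
    corrs = []
    for _ in range(depth):
        W = rng.standard_normal((dim, dim)) / np.sqrt(dim)  # random layer
        x, y = np.sign(W @ x), np.sign(W @ y)               # sign activation
        corrs.append(float(x @ y) / dim)  # +/-1 vectors have norm sqrt(dim)
    return corrs

avg = np.mean([layer_correlations(rho=0.9) for _ in range(trials)], axis=0)
for d, c in enumerate(avg, start=1):
    print(f"depth {d:2d}: avg correlation {c:+.3f}")
```

Under this setup the printed correlations shrink toward zero with each layer, consistent with the abstract's claim that even highly correlated inputs become near-orthogonal at the output of a deep random network.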
Year | DOI | Venue
---|---|---
2020 | 10.5555/3381089.3381113 | SODA '20: ACM-SIAM Symposium on Discrete Algorithms, Salt Lake City, Utah, January 2020
Field | DocType | Citations
---|---|---
Discrete mathematics, Computer science, Theoretical computer science, Learnability | Conference | 1

PageRank | References | Authors
---|---|---
0.35 | 0 | 4
Name | Order | Citations | PageRank |
---|---|---|---
Abhimanyu Das | 1 | 314 | 22.43 |
Sreenivas Gollapudi | 2 | 1198 | 64.70 |
Ravi Kumar | 3 | 13932 | 1642.48 |
Rina Panigrahy | 4 | 3203 | 269.05 |