| Title |
|---|
| The committee machine: Computational to statistical gaps in learning a two-layers neural network |
| Abstract |
|---|
| Heuristic tools from statistical physics have been used in the past to locate the phase transitions and compute the optimal learning and generalization errors in the teacher-student scenario in multi-layer neural networks. In this contribution, we provide a rigorous justification of these approaches for a two-layer neural network model called the committee machine. We also introduce a version of the approximate message passing (AMP) algorithm for the committee machine that allows optimal learning to be performed in polynomial time for a large set of parameters. We find that there are regimes in which a low generalization error is information-theoretically achievable while the AMP algorithm fails to deliver it, strongly suggesting that no efficient algorithm exists for those cases and unveiling a large computational gap. |
| Year | DOI | Venue |
|---|---|---|
| 2018 | 10.1088/1742-5468/ab43d2 | Advances in Neural Information Processing Systems 31 (NIPS 2018) |
| Keywords | DocType | Volume |
|---|---|---|
| neural networks, polynomial time, neural network, neural network models, statistical physics, large set, neural network model | Conference | 31 |

| Issue | ISSN | Citations |
|---|---|---|
| 12 | 1049-5258 | 2 |

| PageRank | References | Authors |
|---|---|---|
| 0.37 | 0 | 6 |
| Name | Order | Citations | PageRank |
|---|---|---|---|
| Benjamin Aubin | 1 | 2 | 2.06 |
| Antoine Maillard | 2 | 2 | 2.40 |
| Jean Barbier | 3 | 117 | 13.46 |
| Florent Krzakala | 4 | 977 | 67.30 |
| Nicolas Macris | 5 | 268 | 26.29 |
| Lenka Zdeborová | 6 | 1190 | 78.62 |