Title
The committee machine: Computational to statistical gaps in learning a two-layers neural network.
Abstract
Heuristic tools from statistical physics have been used in the past to locate phase transitions and to compute the optimal learning and generalization errors in the teacher-student scenario for multi-layer neural networks. In this contribution, we provide a rigorous justification of these approaches for a two-layer neural network model called the committee machine. We also introduce a version of the approximate message passing (AMP) algorithm for the committee machine that allows one to perform optimal learning in polynomial time for a large set of parameters. We find that there are regimes in which a low generalization error is information-theoretically achievable while the AMP algorithm fails to deliver it, strongly suggesting that no efficient algorithm exists for those cases and unveiling a large computational gap.
Year: 2018
DOI: 10.1088/1742-5468/ab43d2
Venue: ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018)
Keywords: neural networks, polynomial time, neural network, neural network models, statistical physics, large set, neural network model
DocType: Conference
Volume: 31
Issue: 2
ISSN: 1049-5258
Citations: 12
PageRank: 0.37
References: 0
Authors: 6
Name | Order | Citations | PageRank
Aubin, Benjamin | 1 | 2 | 2.06
Antoine Maillard | 2 | 2 | 2.40
Jean Barbier | 3 | 117 | 13.46
Florent Krzakala | 4 | 977 | 67.30
Nicolas Macris | 5 | 268 | 26.29
Lenka Zdeborová | 6 | 1190 | 78.62