Title |
---|
Dynamical mean-field theory for stochastic gradient descent in Gaussian mixture classification |
Abstract |
---|
We analyze in closed form the learning dynamics of stochastic gradient descent (SGD) for a single-layer neural network classifying a high-dimensional Gaussian mixture where each cluster is assigned one of two labels. This problem provides a prototype of a non-convex loss landscape with interpolating regimes and a large generalization gap. We define a particular stochastic process for which SGD can be extended to a continuous-time limit that we call stochastic gradient flow. In the full-batch limit, we recover the standard gradient flow. We apply dynamical mean-field theory from statistical physics to track the dynamics of the algorithm in the high-dimensional limit via a self-consistent stochastic process. We explore the performance of the algorithm as a function of the control parameters, shedding light on how it navigates the loss landscape. |
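For a concrete picture of the setup the abstract describes, here is a minimal Python sketch of online SGD for a single-layer classifier on a two-cluster Gaussian mixture. It is an illustration under simplified assumptions (two clusters, logistic loss, and the dimension, learning rate, and noise scaling chosen here are arbitrary), not the authors' protocol or code.

```python
# A minimal sketch (not the authors' code) of the setup in the abstract:
# online SGD training a single-layer classifier on a high-dimensional
# two-cluster Gaussian mixture with labels +/-1. Dimension, learning rate,
# loss, and noise scaling are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

d = 1000         # input dimension (stand-in for the high-dimensional limit)
n_steps = 20000  # number of SGD updates
lr = 0.5         # learning rate

mu = rng.standard_normal(d) / np.sqrt(d)  # cluster mean, O(1) norm

def sample(batch=1):
    """Draw (x, y) with y = +/-1 and x = y*mu + Gaussian noise."""
    y = rng.choice([-1.0, 1.0], size=batch)
    x = y[:, None] * mu[None, :] + rng.standard_normal((batch, d)) / np.sqrt(d)
    return x, y

w = np.zeros(d)  # single-layer weights
b = 0.0          # bias

for _ in range(n_steps):
    x, y = sample()                  # mini-batch of size 1 (online SGD)
    z = x @ w + b                    # pre-activation
    g = -y / (1.0 + np.exp(y * z))   # gradient of the logistic loss w.r.t. z
    w -= lr * (g[:, None] * x).mean(axis=0)
    b -= lr * g.mean()

# Estimate the generalization (test) error on fresh samples.
x_test, y_test = sample(batch=5000)
test_error = np.mean(np.sign(x_test @ w + b) != y_test)
print(f"test error: {test_error:.3f}")
```

In the paper's terms, `batch=1` corresponds to the fully online regime; taking the batch to the full dataset recovers the gradient-flow limit mentioned in the abstract.
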
Year | DOI | Venue
---|---|---
2020 | 10.1088/1742-5468/ac3a80 | Journal of Statistical Mechanics: Theory and Experiment
Keywords | DocType | Volume
---|---|---
learning theory, machine learning | Conference | 2021
Issue | ISSN | Citations
---|---|---
12 | 1742-5468 | 0
PageRank | References | Authors
---|---|---
0.34 | 0 | 4
Name | Order | Citations | PageRank
---|---|---|---
Francesca Mignacco | 1 | 0 | 0.68 |
Florent Krzakala | 2 | 977 | 67.30 |
Pierfrancesco Urbani | 3 | 1 | 2.72 |
Lenka Zdeborová | 4 | 1190 | 78.62 |