Title
The Loss Surface of Deep Linear Networks Viewed Through the Algebraic Geometry Lens
Abstract
Using the viewpoint of modern computational algebraic geometry, we explore properties of the optimization landscapes of deep linear neural network models. After clarifying the various definitions of "flat" minima, we show that the geometrically flat minima, which are merely artifacts of residual continuous symmetries of the deep linear networks, can be straightforwardly removed by a generalized $L_2$-regularization. We then establish upper bounds on the number of isolated stationary points of these networks with the help of algebraic geometry. Combining these upper bounds with a method from numerical algebraic geometry, we find all stationary points for networks of modest depth and matrix size. We demonstrate that, in the presence of non-zero regularization, deep linear networks can indeed possess local minima that are not global minima. Finally, we show that although the number of stationary points grows as the number of neurons increases or the regularization parameters decrease, higher-index saddles are surprisingly rare.
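To make the objective described in the abstract concrete, the following is a minimal sketch (not taken from the paper) of a two-layer deep linear network with an $L_2$ penalty. The layer widths, data, and the single regularization weight lam are hypothetical choices for illustration only; the paper's generalized regularization may weight layers differently. The gradient equations returned below form the polynomial system whose common zeros are the stationary points the paper bounds and enumerates.

```python
# Minimal illustrative sketch (assumptions: a 2-layer network, random data,
# one shared regularization weight lam). Not the paper's exact setup.
import numpy as np

rng = np.random.default_rng(0)

# Deep linear network with two weight matrices: Yhat = W2 @ W1 @ X
d_in, d_hidden, d_out, n = 3, 2, 3, 10
X = rng.standard_normal((d_in, n))
Y = rng.standard_normal((d_out, n))
W1 = rng.standard_normal((d_hidden, d_in))
W2 = rng.standard_normal((d_out, d_hidden))
lam = 0.1  # regularization parameter (hypothetical value)

def loss(W1, W2):
    # Squared-error term plus an L2 penalty on each weight matrix.
    # The penalty is not invariant under W1 -> A W1, W2 -> W2 A^{-1},
    # which removes the flat directions mentioned in the abstract.
    residual = W2 @ W1 @ X - Y
    return 0.5 * np.sum(residual**2) + 0.5 * lam * (np.sum(W1**2) + np.sum(W2**2))

def grad(W1, W2):
    # Stationary points are the common zeros of these polynomial equations
    # in the entries of W1 and W2.
    residual = W2 @ W1 @ X - Y
    g1 = W2.T @ residual @ X.T + lam * W1
    g2 = residual @ (W1 @ X).T + lam * W2
    return g1, g2

# Example: evaluate the objective and the stationarity conditions at a random point.
print("loss:", loss(W1, W2))
g1, g2 = grad(W1, W2)
print("gradient norm:", np.sqrt(np.sum(g1**2) + np.sum(g2**2)))
```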
Year: 2018
DOI: 10.1109/TPAMI.2021.3071289
Venue: IEEE Transactions on Pattern Analysis and Machine Intelligence
Keywords: Deep linear network, global optimization, regularization, numerical algebraic geometry
Field: Residual, Algebraic geometry, Mathematical optimization, Matrix (mathematics), Mathematical analysis, Maxima and minima, Stationary point, Regularization (mathematics), Artificial neural network, Mathematics, Homogeneous space
DocType: Journal
Volume: 44
Issue: 9
ISSN: 0162-8828
Citations: 0
PageRank: 0.34
References: 29
Authors: 4

Name                     Order  Citations  PageRank
Dhagash Mehta            1      15         8.26
Tianran Chen             2      9          4.17
Tingting Tang            3      0          0.34
Jonathan D. Hauenstein   4      269        37.65