Title
Who is Afraid of Big Bad Minima? Analysis of gradient-flow in spiked matrix-tensor models
Abstract
Gradient-based algorithms are effective for many machine learning tasks, but despite ample recent effort and some progress, it often remains unclear why they work in practice when optimising high-dimensional non-convex functions, and why they find good minima instead of being trapped in spurious ones. Here we present a quantitative theory explaining this behaviour in a spiked matrix-tensor model. Our framework is based on the Kac-Rice analysis of stationary points and a closed-form analysis of gradient-flow originating from statistical physics. We show that there is a well-defined region of parameters where the gradient-flow algorithm finds a good global minimum despite the presence of exponentially many spurious local minima. We show that this is achieved by surfing on saddles that have a strong negative direction towards the global minima, a phenomenon connected to a BBP-type threshold in the Hessian describing the critical points of the landscape.
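The abstract describes gradient flow on the landscape of a spiked matrix-tensor model. Below is a minimal numerical sketch, using gradient descent as a discretisation of gradient flow on the sphere. The noise variances `delta2`, `delta3`, the prefactors in the loss, the system size, and the step size are all illustrative assumptions, not the exact setup of the paper, which studies gradient flow analytically.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 50
delta2, delta3 = 0.5, 1.0  # assumed noise variances for the two channels

# planted spike x on the sphere of radius sqrt(N)
x = rng.standard_normal(N)
x *= np.sqrt(N) / np.linalg.norm(x)

# spiked matrix Y and spiked order-3 tensor T (noise left unsymmetrised; fine for a sketch)
Y = np.outer(x, x) / np.sqrt(N) + np.sqrt(delta2) * rng.standard_normal((N, N))
Y = (Y + Y.T) / 2
T = (np.einsum('i,j,k->ijk', x, x, x) / N
     + np.sqrt(delta3) * rng.standard_normal((N, N, N)))

def grad(s):
    """Gradient of an (illustrative) spiked matrix-tensor loss at s."""
    g2 = -(2.0 / (delta2 * np.sqrt(N))) * (Y @ s)
    g3 = -(3.0 / (delta3 * N)) * np.einsum('ijk,j,k->i', T, s, s)
    return g2 + g3

# gradient descent from a random point on the sphere
s = rng.standard_normal(N)
s *= np.sqrt(N) / np.linalg.norm(s)
lr = 1e-3
for _ in range(2000):
    s -= lr * grad(s)
    s *= np.sqrt(N) / np.linalg.norm(s)  # project back onto the sphere

# overlap with the planted spike: 0 = no recovery, 1 = perfect recovery
overlap = abs(x @ s) / N
```

With the matrix channel this strong (1/delta2 above the BBP threshold), the dynamics typically align with the spike; tuning `delta2` and `delta3` moves the system between the easy and trapped regions discussed in the abstract.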
Year
2019
Venue
ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 32 (NIPS 2019)
Keywords
stationary points, statistical physics
Field
Applied mathematics, Mathematical optimization, Tensor, Matrix (mathematics), Computer science, Maxima and minima, Balanced flow
DocType
Conference
Volume
32
ISSN
1049-5258
Citations
0
PageRank
0.34
References
0
Authors
5
Name                    Order  Citations  PageRank
Stefano Sarao Mannelli  1      0          1.69
Giulio Biroli           2      56         11.79
Chiara Cammarota        3      10         2.89
Florent Krzakala        4      977        67.30
Lenka Zdeborová         5      1190       78.62