Title
Second-Order Guarantees of Stochastic Gradient Descent in Nonconvex Optimization
Abstract
Recent years have seen increased interest in performance guarantees of gradient descent algorithms for nonconvex optimization. A number of works have uncovered that gradient noise plays a critical role in the ability of gradient descent recursions to efficiently escape saddle points and reach second-order stationary points. Most available works require the gradient noise component to be bounded with probability one or sub-Gaussian, and leverage concentration inequalities to arrive at high-probability results. We present an alternative approach that relies primarily on mean-square arguments, and show that a more relaxed, relative bound on the gradient noise variance is sufficient to ensure efficient escape from saddle points, without the need to inject additional noise, employ alternating step sizes, or rely on a global dispersive noise assumption, as long as a gradient noise component is present in a descent direction for every saddle point.
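The escape behavior described in the abstract can be illustrated with a small simulation. The sketch below is not from the paper; the cost function, noise model, and parameter values are illustrative assumptions. It runs constant-step-size SGD on a two-dimensional nonconvex cost with a strict saddle at the origin, using zero-mean gradient noise whose variance obeys a relative bound and which has a component along the saddle's descent direction.

```python
import numpy as np

# Toy nonconvex cost J(w) = 0.5*w[0]**2 - 0.5*w[1]**2 + 0.25*w[1]**4 (illustrative).
# The origin is a strict saddle point; the minimizers are at w = (0, +1) and (0, -1).
def grad_J(w):
    return np.array([w[0], -w[1] + w[1] ** 3])

def noisy_grad(w, rng, beta=0.1, sigma=0.01):
    """True gradient plus zero-mean noise whose conditional variance satisfies a
    relative bound of the form E||s||^2 <= beta^2 * ||grad||^2 + sigma^2 (assumed)."""
    g = grad_J(w)
    scale = np.sqrt(beta ** 2 * np.dot(g, g) + sigma ** 2)
    s = scale * rng.standard_normal(2) / np.sqrt(2)  # isotropic: has a descent-direction component
    return g + s

rng = np.random.default_rng(0)
mu = 0.05                    # constant step size (illustrative)
w = np.array([0.5, 0.0])     # start on the stable manifold of the saddle at the origin
for i in range(2000):
    w = w - mu * noisy_grad(w, rng)

print("final iterate:", w)   # expected to end near (0, +1) or (0, -1), not at the saddle
```

Deterministic gradient descent started at (0.5, 0.0) would converge to the saddle at the origin; the noise component along the second coordinate is what pushes the iterate off the stable manifold and toward a second-order stationary point.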
Year
2022
DOI
10.1109/TAC.2021.3131963
Venue
IEEE Transactions on Automatic Control
Keywords
Adaptation, gradient noise, nonconvex cost, stationary points, stochastic optimization
DocType
Journal
Volume
67
Issue
12
ISSN
0018-9286
Citations
0
PageRank
0.34
References
0
Authors
2
Name | Order | Citations | PageRank
Stefan Vlaski | 1 | 231 | 1.39
Ali H. Sayed | 2 | 91346 | 67.71