Title
Perturbed Projected Gradient Descent Converges To Approximate Second-Order Points For Bound Constrained Nonconvex Problems
Abstract
In this paper, a gradient-based method for bound-constrained nonconvex problems is proposed. By combining projected gradient descent with perturbed gradient descent, the proposed algorithm, named perturbed projected gradient descent (PP-GD), converges to approximate second-order stationary (SS2) points (which satisfy certain approximate second-order necessary conditions) with provable convergence rate guarantees. The proposed algorithm is suitable for large-scale problems since it only uses gradient information of the objective function. It also seamlessly incorporates variable constraints such as nonnegativity, which are common in many practical machine learning problems. We provide a concrete theoretical analysis showing that PP-GD obtains approximate second-order solutions by extracting the negative curvature of the objective function around strict saddle points. Numerical results demonstrate that PP-GD indeed converges faster than other first-order methods in the presence of strict saddle points.
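The mechanism described in the abstract, projected gradient steps on the bound constraint plus a random perturbation when progress stalls, can be sketched as follows. This is a hypothetical minimal illustration: the function name `pp_gd`, the fixed step size, perturbation radius, and stall test are illustrative choices, not the step-size schedule or perturbation condition analyzed in the paper.

```python
import numpy as np

def pp_gd(grad, x0, lower, upper, step=1e-2, radius=1e-3,
          tol=1e-4, max_iter=10000, seed=0):
    """Minimal sketch of perturbed projected gradient descent on a box.

    Each iteration takes a gradient step and projects back onto the
    bound constraints [lower, upper]; when the projected step stalls
    (near a first-order stationary point), a small random perturbation
    is added so that negative curvature, if present, can move the
    iterate away from a strict saddle.
    """
    rng = np.random.default_rng(seed)
    x = np.clip(x0, lower, upper)  # start from a feasible point
    for _ in range(max_iter):
        x_new = np.clip(x - step * grad(x), lower, upper)  # projected step
        if np.linalg.norm(x_new - x) < tol:
            # Stalled: perturb inside the box to escape a possible saddle.
            x_new = np.clip(x + radius * rng.standard_normal(x.shape),
                            lower, upper)
        x = x_new
    return x

# Toy problem: f(x, y) = x**2 - y**2 on [0, 1]^2. The point (0, 0) is a
# first-order stationary point with negative curvature along y; the
# perturbation lets the iterate escape toward the minimizer (0, 1).
grad_f = lambda z: np.array([2.0 * z[0], -2.0 * z[1]])
sol = pp_gd(grad_f, np.array([0.5, 0.0]), 0.0, 1.0)
```

Without the perturbation branch, plain projected gradient descent started at (0.5, 0) would stop at the saddle-like point (0, 0) because the gradient along y vanishes there.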
Year
2019
DOI
10.1109/icassp.2019.8683241
Venue
2019 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP)
Keywords
Strict saddle points, projected gradient descent, convergence rate, second-order stationary (SS2) points
Field
Applied mathematics, Mathematical optimization, Gradient descent, Saddle point, Computer science, Rate of convergence, Negative curvature
DocType
Conference
ISSN
1520-6149
Citations
0
PageRank
0.34
References
0
Authors
4
Name | Order | Citations | PageRank
Songtao Lu | 1 | 84 | 19.52
Ziping Zhao | 2 | 31 | 11.50
Kaibin Huang | 3 | 3155 | 182.06
Mingyi Hong | 4 | 1533 | 91.29