Title: An efficient gradient method using the Yuan steplength
Abstract: We propose a new gradient method for quadratic programming, named SDC, which alternates some steepest descent (SD) iterates with some gradient iterates that use a constant steplength computed through the Yuan formula. The SDC method exploits the asymptotic spectral behaviour of the Yuan steplength to foster a selective elimination of the components of the gradient along the eigenvectors of the Hessian matrix, i.e., to push the search in subspaces of smaller and smaller dimensions. The new method has global and R-linear convergence. Furthermore, numerical experiments show that it tends to outperform the Dai–Yuan method, which is one of the fastest gradient methods. In particular, SDC appears superior as the Hessian condition number and the accuracy requirement increase. Finally, if the number of consecutive SD iterates is not too small, the SDC method shows a monotonic behaviour.
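The alternation scheme described in the abstract can be made concrete with a short sketch. The NumPy code below is an illustration, not the authors' implementation: the Yuan-steplength expression is written in the form used in the Dai–Yuan literature, and the parameter names `h` (consecutive SD iterates) and `m` (iterates with the constant Yuan steplength), the helper `yuan_step`, and all default values are assumptions made for this sketch.

```python
import numpy as np

def yuan_step(a1, a2, gn1, gn2):
    """Yuan steplength built from two consecutive SD steplengths a1, a2
    and the gradient norms gn1, gn2 at those iterates (form assumed here)."""
    s = (1.0 / a1 - 1.0 / a2) ** 2 + 4.0 * gn2 ** 2 / (a1 * gn1) ** 2
    return 2.0 / (np.sqrt(s) + 1.0 / a1 + 1.0 / a2)

def sdc(A, b, x0, h=6, m=4, tol=1e-8, max_iter=10000):
    """Sketch of an SDC-style method for min 0.5*x'Ax - b'x with A SPD:
    h exact steepest-descent (Cauchy) steps, then m steps reusing a
    constant steplength obtained from the Yuan formula."""
    x = np.asarray(x0, dtype=float).copy()
    g = A @ x - b
    it = 0
    while np.linalg.norm(g) > tol and it < max_iter:
        hist = []  # (steplength, gradient norm) of the SD iterates
        for _ in range(h):  # --- steepest-descent phase ---
            if np.linalg.norm(g) <= tol:
                return x
            a = (g @ g) / (g @ (A @ g))  # exact Cauchy steplength
            hist.append((a, np.linalg.norm(g)))
            x -= a * g
            g = A @ x - b
            it += 1
        if len(hist) < 2:
            continue
        (a1, gn1), (a2, gn2) = hist[-2], hist[-1]
        a_y = yuan_step(a1, a2, gn1, gn2)
        for _ in range(m):  # --- constant Yuan-steplength phase ---
            if np.linalg.norm(g) <= tol:
                return x
            x -= a_y * g
            g = A @ x - b
            it += 1
    return x
```

The intuition (per the abstract) is that the Yuan steplength settles near the reciprocal of the largest Hessian eigenvalue, so the constant-step phase damps the gradient component along the corresponding eigenvector and the search proceeds in progressively smaller subspaces.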
Year: 2014
DOI: 10.1007/s10589-014-9669-5
Venue: Computational Optimization and Applications
Keywords: Gradient methods, Yuan steplength, Quadratic programming
Field: Gradient method, Monotonic function, Gradient descent, Mathematical optimization, Condition number, Mathematical analysis, Hessian matrix, Rate of convergence, Quadratic programming, Eigenvalues and eigenvectors, Mathematics
DocType: Journal
Volume: 59
Issue: 3
ISSN: 0926-6003
Citations: 17
PageRank: 1.00
References: 9
Authors: 5
Name | Order | Citations | PageRank
Roberta De Asmundis | 1 | 24 | 2.17
D. di Serafino | 2 | 165 | 19.51
William W. Hager | 3 | 1603 | 214.67
Gerardo Toraldo | 4 | 17 | 1.68
Hongchao Zhang | 5 | 809 | 43.29