Title |
---|
Two families of scaled three-term conjugate gradient methods with sufficient descent property for nonconvex optimization |
Abstract |
---|
In this paper, we present two families of modified three-term conjugate gradient methods for solving large-scale unconstrained smooth optimization problems. We show that our new families satisfy the Dai-Liao conjugacy condition and the sufficient descent condition under any line search technique that guarantees the positiveness of $y_k^T s_k$. For uniformly convex functions, we show that our families are globally convergent under the weak Wolfe-Powell line search technique and standard conditions on the objective function. We also establish a weaker global convergence theorem for general smooth functions under similar assumptions. Numerical experiments on 260 standard test problems, in comparison with seven recently developed conjugate gradient methods, illustrate that the members of our families are numerically efficient and effective. |
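To make the abstract's ingredients concrete, the sketch below shows a *generic* three-term conjugate gradient iteration combined with a bisection-style weak Wolfe-Powell line search. This is not the paper's scaled families: the $\beta_k$ choice (Hestenes-Stiefel), the third-term weight $\theta_k$ (a Zhang-Zhou-Li-style construction, which makes $g_{k+1}^T d_{k+1} = -\|g_{k+1}\|^2$, i.e., a descent direction), and all tolerances are illustrative assumptions.

```python
import numpy as np

def weak_wolfe(f, grad, x, d, c1=1e-4, c2=0.9, alpha=1.0, max_iter=50):
    """Bisection search for a step satisfying the weak Wolfe-Powell conditions."""
    lo, hi = 0.0, np.inf
    fx, gTd = f(x), grad(x) @ d
    for _ in range(max_iter):
        if f(x + alpha * d) > fx + c1 * alpha * gTd:   # Armijo fails: shrink
            hi = alpha
            alpha = 0.5 * (lo + hi)
        elif grad(x + alpha * d) @ d < c2 * gTd:       # curvature fails: grow
            lo = alpha
            alpha = 2.0 * lo if np.isinf(hi) else 0.5 * (lo + hi)
        else:
            return alpha
    return alpha

def three_term_cg(f, grad, x0, tol=1e-6, max_iter=500):
    """Generic three-term CG: d_{k+1} = -g_{k+1} + beta_k d_k - theta_k y_k."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = weak_wolfe(f, grad, x, d)
        s = alpha * d                      # s_k = x_{k+1} - x_k
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g                      # y_k = g_{k+1} - g_k
        beta = (g_new @ y) / (d @ y)       # Hestenes-Stiefel beta (assumed choice)
        theta = (g_new @ d) / (d @ y)      # third-term weight; yields g^T d = -||g||^2
        d = -g_new + beta * d - theta * y  # three-term search direction
        x, g = x_new, g_new
    return x, g

# Demo on a strongly convex quadratic: minimize 0.5 x^T A x - b^T x.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star, g_star = three_term_cg(f, grad, np.zeros(2))
```

Note that the weak Wolfe-Powell curvature condition ensures $y_k^T s_k > 0$ along the iterates, which is exactly the positivity requirement the abstract imposes on the line search.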
Year | DOI | Venue |
---|---|---|
2020 | 10.1007/s11075-019-00709-7 | Numerical Algorithms |
Keywords | Field | DocType
---|---|---
Large-scale optimization problem, Nonconvex optimization, Conjugate gradient method, Weak-Wolfe-Powell line search technique, Global convergence, 90C30, 90C06, 90C26 | Convergence (routing), Conjugate gradient method, Applied mathematics, Mathematical analysis, Conjugacy class, Convex function, Line search, Optimization problem, Mathematics | Journal
Volume | Issue | ISSN
---|---|---
83 | 3 | 1017-1398
Citations | PageRank | References
---|---|---
1 | 0.35 | 0
Authors |
---|
2 |
Name | Order | Citations | PageRank |
---|---|---|---|
S. Bojari | 1 | 1 | 0.69 |
M. R. Eslahchi | 2 | 88 | 13.65 |