Abstract |
---|
Recently, Li et al. (Comput. Optim. Appl. 26:131–147, 2004) proposed a regularized Newton method for convex minimization problems. The method retains local quadratic convergence without requiring nonsingularity of the Hessian. In this paper, we develop a truncated regularized Newton method and show its global convergence. We also establish a local quadratic convergence theorem for the truncated method under the same conditions as those in Li et al. (Comput. Optim. Appl. 26:131–147, 2004). Finally, we test the proposed method through numerical experiments and compare its performance with that of the regularized Newton method. The results show that the truncated method outperforms the regularized Newton method. |
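The method summarized above can be illustrated with a minimal sketch, not the authors' implementation: a Newton iteration whose linear system is regularized by λ_k = ‖∇f(x_k)‖ (so the matrix stays nonsingular even when the Hessian is singular, as in Li et al. 2004) and solved only approximately by a conjugate-gradient loop that is truncated once the relative residual is small. Function names, tolerances, and the fixed unit step are assumptions made for illustration.

```python
import numpy as np

def truncated_regularized_newton(grad, hess, x0, tol=1e-8, max_iter=100,
                                 cg_tol=0.1, cg_max=50):
    """Sketch of a truncated regularized Newton iteration (illustrative only).

    At each iterate, solve (H + lam*I) d = -g inexactly by conjugate
    gradients, with regularization lam = ||g||; the CG loop is truncated
    once the residual drops below cg_tol * ||g||.
    """
    x = x0.astype(float)
    for _ in range(max_iter):
        g = grad(x)
        gn = np.linalg.norm(g)
        if gn < tol:
            break
        H = hess(x)
        # Regularized system matrix: positive definite for convex f, g != 0.
        A = H + gn * np.eye(len(x))
        # Truncated CG on A d = -g, starting from d = 0.
        d = np.zeros_like(x)
        r = -g.copy()          # residual of A d = -g at d = 0
        p = r.copy()
        rs = r @ r
        for _ in range(cg_max):
            Ap = A @ p
            alpha = rs / (p @ Ap)
            d += alpha * p
            r -= alpha * Ap
            rs_new = r @ r
            if np.sqrt(rs_new) <= cg_tol * gn:  # truncation test
                break
            p = r + (rs_new / rs) * p
            rs = rs_new
        x = x + d  # unit step; a line search would safeguard global convergence
    return x
```

On a strongly convex quadratic, the iteration drives the gradient to zero quickly because the regularization term λ_k = ‖g_k‖ vanishes as the iterates approach the minimizer.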
Year | DOI | Venue
---|---|---
2009 | 10.1007/s10589-007-9128-7 | Comp. Opt. and Appl.

Keywords | Field | DocType
---|---|---
Convex minimization, Regularized Newton method, Truncated conjugate gradient strategy | Mathematical optimization, Mathematical analysis, Hessian matrix, Singularity, Newton's method in optimization, Rate of convergence, Convex optimization, Mathematics, Steffensen's method, Secant method, Newton's method | Journal
Volume | Issue | ISSN
---|---|---
43 | 1 | 0926-6003

Citations | PageRank | References
---|---|---
4 | 0.48 | 8

Authors |
---|
2 |
Name | Order | Citations | PageRank
---|---|---|---
Ying-Jie Li | 1 | 4 | 0.48 |
Donghui Li | 2 | 380 | 32.40 |