Abstract |
---|
This paper studies convergence properties of regularized Newton methods for minimizing a convex function whose Hessian matrix may be singular everywhere. We show that if the objective function is LC², then the methods possess local quadratic convergence under a local error bound condition, without requiring isolated or nonsingular solutions. Using a backtracking line search, we globalize an inexact regularized Newton method and show that the unit stepsize is eventually accepted. Limited numerical experiments are presented, which show the practical advantage of the method. |
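The method described in the abstract can be sketched roughly as follows. This is a minimal illustrative implementation, not the paper's exact algorithm: the regularization choice `mu = c * ||grad f(x)||` (a common Levenberg-Marquardt-type rule that vanishes at a solution) and the Armijo backtracking parameters are assumptions made for the sketch.

```python
import numpy as np

def regularized_newton(f, grad, hess, x0, c=1.0, tol=1e-10, max_iter=100,
                       beta=0.5, sigma=1e-4):
    """Illustrative regularized Newton method with Armijo backtracking.

    The regularization mu = c * ||g|| keeps the linear system solvable
    even when the Hessian is singular, and shrinks to zero near a
    solution (an assumed rule; the paper's exact choice may differ).
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        gnorm = np.linalg.norm(g)
        if gnorm < tol:
            break
        H = hess(x)
        mu = c * gnorm  # regularization parameter, vanishes at a solution
        d = np.linalg.solve(H + mu * np.eye(len(x)), -g)  # regularized Newton step
        # Armijo backtracking line search; the unit stepsize t = 1
        # is tried first and accepted near a solution.
        t, fx = 1.0, f(x)
        while f(x + t * d) > fx + sigma * t * g.dot(d):
            t *= beta
        x = x + t * d
    return x

# Example: f(x) = (x1 + x2)^2 has a rank-1 Hessian (singular everywhere)
# and a non-isolated solution set {x : x1 + x2 = 0}.
a = np.array([1.0, 1.0])
f = lambda x: (a @ x) ** 2
grad = lambda x: 2 * (a @ x) * a
hess = lambda x: 2 * np.outer(a, a)
x_star = regularized_newton(f, grad, hess, np.array([3.0, -1.0]))
```

On this example the iterates converge to a point on the solution line rather than to a unique minimizer, which is exactly the setting the local error bound condition is meant to cover.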
Year | DOI | Venue |
---|---|---|
2004 | 10.1023/B:COAP.0000026881.96694.32 | Computational Optimization and Applications |
Keywords | Field | DocType
---|---|---|
minimization problem, regularized Newton methods, global convergence, quadratic convergence, unit step | Mathematical optimization, Mathematical analysis, Hessian matrix, Backtracking line search, Convex function, Rate of convergence, Invertible matrix, Convex optimization, Mathematics, Steffensen's method, Newton's method | Journal |
Volume | Issue | ISSN
---|---|---|
28 | 2 | 1573-2894 |
Citations | PageRank | References
---|---|---|
12 | 0.81 | 1 |
Authors |
---|
4 |
Name | Order | Citations | PageRank |
---|---|---|---|
Donghui Li | 1 | 380 | 32.40 |
Masao Fukushima | 2 | 2050 | 172.73 |
Liqun Qi | 3 | 3155 | 284.52 |
Nobuo Yamashita | 4 | 53 | 7.91 |