Abstract |
---|
Newton’s method is a fundamental method for solving nonlinear, univariate, unconstrained optimization problems. In this study, a new line search technique based on Chebyshev polynomials is presented. The proposed method is adaptive: it determines a descent direction at each iteration and avoids convergence to a maximum point. Approximations to the first and second derivatives of a function are derived using high-order pseudospectral differentiation matrices. The efficiency of the new method is analyzed, in terms of the most popular and widely used criteria, in comparison with Newton’s method on seven test functions. |
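The two ingredients the abstract names can be sketched briefly. The snippet below is a minimal illustration, not the authors' actual algorithm: it builds the standard Chebyshev differentiation matrix on Gauss–Lobatto points (Trefethen's construction) to approximate f' and f'' spectrally, and then runs a Newton iteration with a simple descent safeguard so the iterates cannot be attracted to a maximum the way raw Newton can. The test function, node count, and the gradient-step fallback are all illustrative assumptions.

```python
import numpy as np

def cheb(N):
    """Chebyshev differentiation matrix on the Gauss-Lobatto points
    x_j = cos(j*pi/N), j = 0..N (standard construction; assumes N >= 2)."""
    x = np.cos(np.pi * np.arange(N + 1) / N)
    c = np.r_[2.0, np.ones(N - 1), 2.0] * (-1.0) ** np.arange(N + 1)
    dX = x[:, None] - x[None, :]
    D = np.outer(c, 1.0 / c) / (dX + np.eye(N + 1))
    return D - np.diag(D.sum(axis=1)), x

# Spectral approximation of f' and f'' for an illustrative
# smooth function f(x) = exp(x) * sin(2x) on [-1, 1].
D, x = cheb(32)
f = np.exp(x) * np.sin(2 * x)
df, d2f = D @ f, D @ (D @ f)      # derivative values at the nodes
exact = np.exp(x) * (np.sin(2 * x) + 2 * np.cos(2 * x))
assert np.max(np.abs(df - exact)) < 1e-8   # spectral accuracy

def safeguarded_newton(fp, fpp, x0, tol=1e-10, maxit=100):
    """Univariate Newton iteration for a minimum that takes the step
    -f'/f'' only when f'' > 0 (a descent direction toward a minimum);
    otherwise it falls back to a plain gradient step. This is one simple
    way to realize the 'avoid a maximum' idea, not the paper's scheme."""
    xk = x0
    for _ in range(maxit):
        g, h = fp(xk), fpp(xk)
        step = -g / h if h > 0 else -g
        xk += step
        if abs(step) < tol:
            break
    return xk

# f(x) = cos(x): maximum at 0, minimum at pi. Starting at 0.5, raw Newton
# would be drawn to the maximum at 0; the safeguard steers toward pi.
xmin = safeguarded_newton(lambda t: -np.sin(t), lambda t: -np.cos(t), 0.5)
print(xmin)
```

With `N = 32` nodes the derivative error is near machine precision for this smooth function, which is the "high order" accuracy the abstract refers to; a finite-difference stencil of comparable cost would be far less accurate.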
Year | DOI | Venue |
---|---|---|
2008 | 10.1016/j.amc.2008.08.013 | Applied Mathematics and Computation |
Keywords | Field | DocType |
Unconstrained optimization, Univariate optimization, Newton’s method, Test functions, Initial point, Spectral methods, Differentiation matrix, Chebyshev polynomials, Chebyshev points | Chebyshev polynomials, Chebyshev pseudospectral method, Mathematical optimization, Mathematical analysis, Chebyshev equation, Descent direction, Line search, Numerical analysis, Mathematics, Newton's method, Chebyshev iteration | Journal
Volume | Issue | ISSN |
206 | 2 | 0096-3003 |
Citations | PageRank | References |
4 | 0.68 | 5 |
Authors |
---|
2 |
Name | Order | Citations | PageRank |
---|---|---|---|
K.T. Elgindy | 1 | 11 | 1.21 |
Abdel-Rahman Hedar | 2 | 404 | 30.79 |