Abstract |
---|
Analysis and experimental results obtained in [1] have revealed that many network training problems are ill-conditioned and may not be solved efficiently by the Gauss-Newton method. The Levenberg-Marquardt algorithm has been used successfully to solve nonlinear least squares problems, but only problems of moderate size, owing to the significant computation and memory cost of each iteration. In the present paper we develop a new algorithm, a modified Gauss-Newton method, which on one hand takes advantage of Jacobian rank deficiency to reduce computation and memory complexity, and on the other hand retains features of the Levenberg-Marquardt algorithm, with better convergence properties than first-order methods. |
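The abstract combines two ideas: Levenberg-Marquardt damping of the Gauss-Newton step, and exploiting rank deficiency of the Jacobian to cut per-iteration cost. A minimal sketch of how these two ideas can be combined is shown below, using a truncated SVD of the Jacobian; this is an illustration of the general technique, not the authors' specific algorithm, and the function name and tolerance are my own choices.

```python
import numpy as np

def lm_step_rank_truncated(J, r, lam, tol=1e-8):
    """One damped (Levenberg-Marquardt-style) Gauss-Newton step that
    exploits rank deficiency of the Jacobian via a truncated SVD.

    J   : (m, n) Jacobian of the residual vector at the current weights
    r   : (m,) residual vector
    lam : damping parameter (lam -> 0 recovers the Gauss-Newton step)
    tol : singular values below tol * s_max are treated as zero,
          so work is done only in the numerically effective rank of J
    """
    U, s, Vt = np.linalg.svd(J, full_matrices=False)
    keep = s > tol * s[0]                       # effective rank of J
    U, s, Vt = U[:, keep], s[keep], Vt[keep]
    # Solve (J^T J + lam I) dx = -J^T r within the retained subspace:
    # dx = -V diag(s / (s^2 + lam)) U^T r
    dx = -Vt.T @ ((s / (s**2 + lam)) * (U.T @ r))
    return dx
```

With `lam = 0` and a full-rank Jacobian this reduces to the ordinary Gauss-Newton (pseudoinverse) step; discarding near-zero singular values is what keeps the step well-defined when the training problem is ill-conditioned.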
Year | DOI | Venue |
---|---|---|
1996 | 10.1007/3-540-61510-5_91 | ICANN |
Keywords | Field | DocType
---|---|---
jacobian rank deficiency,improving neural network,first order,neural network,levenberg marquardt,nonlinear least squares | Convergence (routing),Mathematical optimization,Jacobian matrix and determinant,First order,Computer science,Fractionating column,Artificial intelligence,Non-linear least squares,Artificial neural network,Machine learning,Computation | Conference
ISBN | Citations | PageRank
---|---|---
3-540-61510-5 | 0 | 0.34
References | Authors
---|---
4 | 2
Name | Order | Citations | PageRank |
---|---|---|---|
Guian Zhou | 1 | 10 | 2.37 |
Jennie Si | 2 | 746 | 70.23 |