Abstract |
---|
The backpropagation algorithm is essentially a steepest-descent optimization routine that minimizes a quadratic performance index at each step. Here, the backpropagation algorithm is recast in the framework of generalized least squares. The main advantage is that this eliminates the need to choose an optimal step size, as required by the standard backpropagation algorithm. A simulation on the approximation of a nonlinear dynamical system demonstrates a markedly faster rate of convergence than standard backpropagation. |
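The abstract's core contrast can be sketched on a toy problem. This is a hypothetical illustration, not the paper's exact method: for a model that is linear in its weights, the quadratic error can be minimized either by steepest descent, whose convergence hinges on a hand-picked step size `eta`, or by a direct least-squares solve that needs no step size at all.

```python
import numpy as np

# Toy quadratic objective E(w) = ||y - X w||^2 for a linear-in-parameters model.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true

# Steepest descent, as in standard backpropagation: a step size must be chosen.
w = np.zeros(3)
eta = 0.005  # assumed step size; too large diverges, too small crawls
for _ in range(300):
    grad = -2 * X.T @ (y - X @ w)  # gradient of the quadratic error
    w -= eta * grad

# Least-squares solve: one step, no step size to tune.
w_ls, *_ = np.linalg.lstsq(X, y, rcond=None)

print(np.allclose(w, w_true, atol=1e-3), np.allclose(w_ls, w_true))
```

For a genuinely nonlinear network the least-squares step would be applied iteratively to a local linearization, but the step-size-free character of the update is the same.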
Year | DOI | Venue |
---|---|---|
1993 | 10.1109/ICNN.1993.298624 | San Francisco, CA |
Keywords | Field | DocType
---|---|---|
backpropagation,convergence of numerical methods,least squares approximations,neural nets,nonlinear systems,optimisation,performance index,backpropagation algorithm,convergence rate,generalized least squares,nonlinear dynamical system,optimization,quadratic performance index,steepest gradient descent type,power generation,least squares approximation,neural networks,gradient descent,rate of convergence,convergence,artificial neural networks | Least squares,Gradient descent,Mathematical optimization,Stochastic gradient descent,Algorithm,Generalized least squares,Non-linear least squares,Backpropagation,Total least squares,Artificial neural network,Mathematics | Conference |
Citations | PageRank | References
---|---|---|
0 | 0.34 | 1 |
Authors |
---|
2 |
Name | Order | Citations | PageRank |
---|---|---|---|
Ai Poh Loh | 1 | 91 | 7.74 |
K. F. Fong | 2 | 5 | 1.52 |