Abstract |
---|
Oja's equations describe a well-studied system for unsupervised Hebbian learning of principal components. This paper derives the explicit time-domain solution of Oja's equations for the single-neuron case. It also shows that, under a linear change of coordinates, these equations are a gradient system in the general multi-neuron case. This latter result leads to a new Lyapunov-like function for Oja's equations. |
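The abstract concerns the single-neuron form of Oja's rule, which drives a weight vector toward the unit principal eigenvector of the input covariance. As a rough illustration (not the paper's closed-form solution), a discrete-time simulation of the rule can be sketched as follows; the covariance matrix, learning rate, and sample count here are illustrative assumptions:

```python
import numpy as np

# Hedged sketch: discrete-time single-neuron Oja's rule. The ODE version of
# this update is what the paper solves in closed form; the data distribution
# and step size below are assumptions for illustration only.
rng = np.random.default_rng(0)

# Zero-mean data with a dominant principal direction (covariance C).
C = np.array([[3.0, 1.0],
              [1.0, 1.0]])
L = np.linalg.cholesky(C)
X = rng.standard_normal((5000, 2)) @ L.T

w = rng.standard_normal(2)   # initial weight vector
eta = 0.01                   # learning rate (assumed)
for x in X:
    y = w @ x                       # neuron output
    w += eta * y * (x - y * w)     # Hebbian term minus Oja's decay term

# w should converge (up to sign) to the unit top eigenvector of C.
v = np.linalg.eigh(C)[1][:, -1]    # eigh returns ascending order
alignment = abs(w @ v) / np.linalg.norm(w)
print(f"alignment with top eigenvector: {alignment:.3f}")
```

The decay term `- eta * y**2 * w` is what normalizes the weight vector, so no explicit renormalization step is needed.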
Year | DOI | Venue |
---|---|---|
1995 | 10.1162/neco.1995.7.5.915 | Neural Computation |
Keywords | Field | DocType
---|---|---|
linear change of coordinates, well-studied system, single-neuron case, unsupervised Hebbian learning, explicit time-domain solution, principal component, general multi-neuron case, latter result, time-domain solution, new Lyapunov-like function, gradient system, Hebbian learning, time domain | Time domain, Applied mathematics, Lyapunov function, Mathematical optimization, Matrix (mathematics), Oja's rule, Hebbian theory, Artificial intelligence, Artificial neural network, Generalized Hebbian Algorithm, Principal component analysis, Mathematics | Journal
Volume | Issue | ISSN
---|---|---|
7 | 5 | 0899-7667 |
Citations | PageRank | References
---|---|---|
6 | 1.05 | 5 |
Authors |
---|
2 |
Name | Order | Citations | PageRank |
---|---|---|---|
J. L. Wyatt, Jr. | 1 | 25 | 13.34 |
Ibrahim M. Elfadel | 2 | 153 | 40.15 |