Title
Feedforward Hebbian learning with nonlinear output units: a Lyapunov approach
Abstract
A Lyapunov function is constructed for the unsupervised learning equations of a large class of neural networks. These networks have a single layer of adjustable connections; units in the output layer are recurrently connected with fixed symmetric weights. The constructed function is similar in form to that derived by Cohen-Grossberg and Hopfield. Two theorems are proved regarding the location of stable equilibria in the limit of high gain transfer functions. The analysis is applied to the soft competitive learning networks of Amari and Takeuchi.
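The abstract's central object, a Lyapunov (energy) function of Cohen-Grossberg/Hopfield form for output units with fixed symmetric recurrent weights, can be illustrated with a minimal numerical sketch. This is a generic Hopfield-style construction for the output-layer settling dynamics `du/dt = -u + W g(u) + b` (with `b` standing in for the feedforward drive `M @ x`), not the paper's exact equations; all variable names and sizes here are illustrative assumptions.

```python
import numpy as np

# Sketch (assumed setup, not the paper's construction): output units with
# fixed symmetric recurrent weights W, sigmoidal transfer g = tanh, and a
# constant feedforward input b = M @ x.  The classic Hopfield energy
#   E(u) = -1/2 g(u)' W g(u) - b' g(u) + sum_i \int_0^{g(u_i)} g^{-1}(s) ds
# is nonincreasing along the settling dynamics du/dt = -u + W g(u) + b.

rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n))
W = 0.5 * (A + A.T)            # fixed symmetric recurrent weights
np.fill_diagonal(W, 0.0)
b = rng.standard_normal(n)     # feedforward drive M @ x, held fixed

g = np.tanh                    # sigmoidal transfer function

def energy(u):
    """Hopfield-style Lyapunov function for du/dt = -u + W g(u) + b."""
    y = g(u)
    # integral of g^{-1} from 0 to y_i, evaluated in closed form for g = tanh
    leak = np.sum(y * np.arctanh(y) + 0.5 * np.log1p(-y**2))
    return -0.5 * y @ W @ y - b @ y + leak

# Euler-integrate the settling dynamics and record the energy trace.
u = 0.1 * rng.standard_normal(n)
dt = 0.01
energies = [energy(u)]
for _ in range(2000):
    u = u + dt * (-u + W @ g(u) + b)
    energies.append(energy(u))
```

With a small enough step size the recorded energies decrease monotonically to the value at a stable equilibrium; taking the gain of `g` large pushes the equilibria toward saturation, which is the regime the two theorems in the paper address.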
Year
DOI
Venue
1996
10.1016/0893-6080(95)00044-5
Neural Networks
Keywords
competitive learning, unsupervised learning, nonlinear output unit, principal component analysis, correlational learning, feedforward hebbian, high gain sigmoid, lyapunov functions, lyapunov approach, saturation, neural network, lyapunov function, hebbian learning, transfer function
DocType
Journal
Volume
9
Issue
2
ISSN
0893-6080
Citations
0
PageRank
0.34
References
13
Authors
1
Name
Todd W Troyer
Order
1
Citations
90
PageRank
27.69