Title: Unsupervised Learning Of Data Principal Eigenstructure
Abstract: This paper proposes a principal component analysis (PCA) criterion whose optimization yields the principal eigenvectors of the data correlation matrix as well as the associated eigenvalues. The corresponding learning algorithms are derived for the unsupervised training of one-layer linear neural networks. The part of the algorithm that estimates the principal eigenvectors turns out to be a version of Sanger's generalized Hebbian algorithm (GHA) that enjoys adaptive learning rates and fast convergence. The proposed criterion differs from standard PCA criteria such as Maximum Variance and Minimum MSE in that (a) the latter provide only the principal eigenvectors, and (b) their corresponding learning algorithm, namely the GHA, has a fixed learning rate. Simulation results illustrate the fast convergence of the derived algorithm.
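The standard GHA that the abstract contrasts against updates each output unit with a Hebbian term minus a Gram-Schmidt-style deflation by the preceding units. A minimal NumPy sketch of this baseline (with the fixed learning rate the abstract mentions; the function name, parameters, and defaults here are our own illustration, not the paper's adaptive variant):

```python
import numpy as np

def gha(X, n_components, lr=0.001, epochs=100, seed=0):
    """Sanger's generalized Hebbian algorithm (GHA), fixed learning rate.

    Estimates the top `n_components` eigenvectors of the correlation
    matrix of the samples in X (shape: n_samples x n_features).
    """
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Rows of W converge to the principal eigenvectors (unit norm).
    W = rng.normal(scale=0.1, size=(n_components, d))
    for _ in range(epochs):
        for x in X:
            y = W @ x  # outputs of the one-layer linear network
            # Sanger's rule: Hebbian term y x^T minus lower-triangular
            # deflation LT[y y^T] W, which orthogonalizes the rows.
            W += lr * (np.outer(y, x) - np.tril(np.outer(y, y)) @ W)
    return W
```

On strongly anisotropic data the first row of the returned matrix aligns (up to sign) with the leading eigenvector of the data correlation matrix; the eigenvalues themselves are not produced, which is one of the gaps the proposed criterion addresses.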
Year: 1997
Venue: PROGRESS IN CONNECTIONIST-BASED INFORMATION SYSTEMS, VOLS 1 AND 2
Field: Pattern recognition, Computer science, Unsupervised learning, Artificial intelligence, Machine learning
DocType: Conference
Citations: 0
PageRank: 0.34
References: 0
Authors: 3
Name: Mehdi N. Shirazi, Order: 1, Citations: 0, PageRank: 0.34
Name: Ferdinand Peper, Order: 2, Citations: 362, PageRank: 52.94
Name: Hidefumi Sawai, Order: 3, Citations: 69, PageRank: 18.04