Abstract
---
One of the many possible conditions for pattern storage in a Hopfield net is to demand that the local field vector be a pattern reconstruction. We use this criterion to derive a set of weights for the storage of correlated biased patterns in a fully connected net. The connections are built from the eigenvectors, or principal components, of the pattern correlation matrix. Since these are often identified with the features of a pattern set, we have named this particular set of weights the feature matrix. We present simulation results showing that the feature matrix is capable of storing up to N random patterns in a network of N spins. Basins of attraction are also investigated via simulation, and we compare them with both our theoretical analysis and those of the pseudo-inverse rule. A statistical mechanical investigation using the replica trick confirms the result for storage capacity. Finally, we discuss a biologically plausible learning rule capable of realising the feature matrix in a fully connected net. Copyright © 1996 Elsevier Science Ltd
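As a rough illustration of the storage criterion described in the abstract, the sketch below builds a weight matrix from the eigenvectors of the pattern correlation matrix so that the local field reconstructs each stored pattern (W ξ^μ = ξ^μ). This is not the paper's exact construction, only a minimal numpy realization of the idea; the function name, the tolerance `tol`, and the test sizes are all chosen here for illustration.

```python
import numpy as np

def feature_matrix(patterns, tol=1e-10):
    """Sketch: weights built from the principal components of the
    pattern correlation matrix, so the local field W @ xi reconstructs
    every stored pattern xi (the storage criterion in the abstract).

    patterns: (p, N) array of +/-1 spin configurations.
    """
    p, N = patterns.shape
    # Pattern correlation matrix (N x N); its eigenvectors are the "features".
    C = patterns.T @ patterns / N
    eigvals, eigvecs = np.linalg.eigh(C)
    # Keep eigenvectors with non-negligible eigenvalues: they span the
    # subspace containing the stored patterns.
    E = eigvecs[:, eigvals > tol]
    # Orthogonal projector onto that subspace, so W @ xi^mu == xi^mu.
    return E @ E.T

# Store p random patterns in a net of N spins and check the criterion.
rng = np.random.default_rng(0)
N, p = 100, 60
xi = rng.choice([-1.0, 1.0], size=(p, N))
W = feature_matrix(xi)
h = xi @ W                 # local fields h^mu = W xi^mu (W is symmetric)
assert np.allclose(h, xi)  # each pattern is a fixed point of sign(h)
```

For linearly independent patterns this projector coincides with the weights of the pseudo-inverse (projection) rule, which is presumably why the abstract compares basins of attraction against that rule; the paper's own derivation targets correlated, biased patterns, where the eigenvalue spectrum of the correlation matrix is no longer that of the random case.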
Year | DOI | Venue |
---|---|---|
1996 | 10.1016/0893-6080(95)00113-1 | Neural Networks |
Keywords | Field | DocType
---|---|---|
statistical mechanics, principal component analysis, hopfield networks, principal component, hopfield network, eigenvectors, local field, correlation matrix | Statistical mechanics, Computer science, Algorithm, Learning rule, Artificial intelligence, Covariance matrix, Artificial neural network, Hopfield network, Replica trick, Principal component analysis, Machine learning, Eigenvalues and eigenvectors | Journal
Volume | Issue | ISSN
---|---|---|
9 | 5 | 0893-6080
Citations | PageRank | References
---|---|---|
2 | 0.50 | 3
Authors
---
2
Name | Order | Citations | PageRank |
---|---|---|---|
Stephen Coombes | 1 | 184 | 18.30 |
J. G. Taylor | 2 | 2 | 0.50 |