Abstract |
---|
This paper proposes a local PCA-SOM algorithm. The new competition measure is computationally efficient and implicitly incorporates both the Mahalanobis distance and the reconstruction error. Unlike previous models, no matrix inversion or PCA decomposition is needed for each data input. Moreover, the local data distribution is stored entirely in the covariance matrix rather than in a pre-defined number of principal components, so no a priori knowledge of the optimal principal subspace is required. Experiments on both synthetic data and a pattern-learning task demonstrate the performance of the proposed method. |
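To make the idea concrete, the following is a minimal, hypothetical sketch of a competitive learning step in which each unit stores a mean and a local covariance matrix and competes via a Mahalanobis-style distance. It is illustrative only: the names (`competition`, `step`), the learning rate, and the update rules are assumptions, and unlike the paper's measure (which avoids a per-input inversion or decomposition) this sketch simply solves a small linear system per input.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (assumed, not from the paper): each unit keeps a mean
# vector and a local covariance matrix summarizing the data it has won.
n_units, dim = 4, 2
means = rng.normal(size=(n_units, dim))
covs = np.stack([np.eye(dim) for _ in range(n_units)])

def competition(x, mean, cov, reg=1e-6):
    """Mahalanobis-style competition measure for one unit.

    Illustrative only: the paper's measure implicitly combines the
    Mahalanobis distance and the reconstruction error without an
    explicit inversion; here we just solve a regularized system.
    """
    d = x - mean
    return float(d @ np.linalg.solve(cov + reg * np.eye(len(d)), d))

def step(x, lr=0.1):
    """One online update: pick the winning unit, refresh its statistics."""
    scores = [competition(x, means[i], covs[i]) for i in range(n_units)]
    w = int(np.argmin(scores))                          # winner takes the input
    d = x - means[w]
    means[w] += lr * d                                  # move mean toward input
    covs[w] = (1 - lr) * covs[w] + lr * np.outer(d, d)  # update local covariance
    return w

for x in rng.normal(size=(200, dim)):
    step(x)
```

The point of the sketch is the data structure: the full local distribution lives in `covs[w]`, so no fixed number of principal components has to be chosen in advance.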
Year | DOI | Venue |
---|---|---|
2008 | 10.1016/j.neucom.2007.10.004 | Neurocomputing |
Keywords | Field | DocType
---|---|---|
local data distribution, local principal component analysis, neural networks, local PCA-SOM algorithm, Mahalanobis distance, PCA decomposition, principal component, matrix inversion, optimal principal subspace, new local PCA-SOM algorithm, synthetic data, self-organizing mapping, data input, unsupervised learning, covariance matrix, neural network, principal component analysis | Matrix (mathematics), Mahalanobis distance, Unsupervised learning, Artificial intelligence, Artificial neural network, Sparse PCA, Pattern recognition, Subspace topology, Algorithm, Covariance matrix, Principal component analysis, Mathematics, Machine learning | Journal
Volume | Issue | ISSN
---|---|---|
71 | 16-18 | Neurocomputing
Citations | PageRank | References
---|---|---|
8 | 0.77 | 14
Authors |
---|
3 |
Name | Order | Citations | PageRank |
---|---|---|---|
Dong Huang | 1 | 163 | 14.20 |
Zhang Yi | 2 | 1765 | 194.41 |
Xiaorong Pu | 3 | 85 | 11.17 |