Title
An efficient KPCA algorithm based on feature correlation evaluation
Abstract
Classic kernel principal component analysis (KPCA) is computationally expensive when extracting features from large data sets. In this paper, we propose efficient KPCA (EKPCA), an algorithm that improves the computational efficiency of KPCA by using a linear combination of a small subset of the training samples, referred to as basic patterns, to approximately express the KPCA feature extractor, i.e., the eigenvectors of the covariance matrix in the feature space. We show that feature correlation (i.e., the correlation between different feature components) can be evaluated by the cosine distance between kernel vectors, which are the column vectors of the kernel matrix. The proposed algorithm is easy to implement: it first uses feature correlation evaluation to determine the basic patterns and then uses these patterns to reconstruct the KPCA model, perform feature extraction, and classify the test samples. Since the basic patterns are usually far fewer than the training samples, feature extraction in EKPCA is much more computationally efficient than in KPCA. Experimental results on several benchmark data sets show that EKPCA is much faster than KPCA while achieving similar classification performance.
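Below is a minimal sketch of the approach described in the abstract, assuming an RBF kernel and a greedy cosine-similarity threshold for choosing the basic patterns. The function names (rbf_kernel, select_basic_patterns, ekpca_fit, ekpca_transform), the threshold rule, and all parameter values are illustrative assumptions, not the paper's exact formulation; the paper's precise selection criterion and centering details may differ.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Pairwise RBF (Gaussian) kernel between the rows of X and Y (assumed kernel choice)."""
    sq = (np.sum(X ** 2, axis=1)[:, None]
          + np.sum(Y ** 2, axis=1)[None, :]
          - 2.0 * X @ Y.T)
    return np.exp(-gamma * np.maximum(sq, 0.0))

def select_basic_patterns(K, threshold=0.95):
    """Greedily keep a training sample only if its kernel (column) vector has
    cosine similarity below `threshold` with every previously kept column."""
    norms = np.linalg.norm(K, axis=0)
    selected = [0]
    for j in range(1, K.shape[1]):
        cos = (K[:, selected].T @ K[:, j]) / (norms[selected] * norms[j])
        if np.max(cos) < threshold:  # weakly correlated with all kept columns
            selected.append(j)
    return selected

def ekpca_fit(X, gamma=1.0, threshold=0.95, n_components=5):
    """Fit a KPCA model on the selected basic patterns only."""
    K = rbf_kernel(X, X, gamma)
    basic = select_basic_patterns(K, threshold)
    Kb = K[np.ix_(basic, basic)]              # kernel matrix over basic patterns
    m = len(basic)
    H = np.eye(m) - np.ones((m, m)) / m       # centering matrix
    vals, vecs = np.linalg.eigh(H @ Kb @ H)   # eigendecomposition of centered kernel
    idx = np.argsort(vals)[::-1][:n_components]
    # Scale eigenvectors so the projection directions have unit norm in feature space.
    alphas = vecs[:, idx] / np.sqrt(np.maximum(vals[idx], 1e-12))
    return X[basic], alphas

def ekpca_transform(X_new, X_basic, alphas, gamma=1.0):
    """Project new samples; kernels are evaluated against basic patterns only
    (test-kernel centering omitted here for brevity)."""
    return rbf_kernel(X_new, X_basic, gamma) @ alphas
```

The speedup comes from the last step: projecting a test sample requires kernel evaluations against the basic patterns only, so the per-sample cost scales with the number of basic patterns rather than the full training set size.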
Year
2014
DOI
10.1007/s00521-013-1424-9
Venue
Neural Computing and Applications
Keywords
cosine distance, feature correlation, feature extraction, kernel principal component analysis
DocType
Journal
Volume
24
Issue
7-8
ISSN
1433-3058
Citations
0
PageRank
0.34
References
32
Authors
4
Name           Order   Citations   PageRank
Zizhu Fan      1       329         14.61
Jinghua Wang   2       162         22.31
Baogen Xu      3       122         19.54
Pengzhi Tang   4       0           0.34