Title
Weighted principal component analysis
Abstract
In this paper, we propose a weighted PCA (WPCA) method. The method first uses the distances between the test sample and each training sample to compute a 'weighted' covariance matrix, and then exploits this covariance matrix to perform feature extraction. Experiments show that the proposed method obtains higher accuracy than conventional PCA. WPCA has the following underlying theoretical foundation: through the 'weighted' covariance matrix, WPCA emphasizes the training samples that are very close to the test sample and reduces the influence of the other training samples. As a result, the test sample is more likely to be classified into the same class as the training samples that are very close to it. The experimental results demonstrate the feasibility and effectiveness of WPCA.
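The following is a minimal Python/NumPy sketch of the weighted-covariance idea described in the abstract. The Gaussian mapping from distances to weights, the sigma parameter, and the function name wpca_features are illustrative assumptions; the record does not specify the exact weighting scheme used in the paper.

import numpy as np

def wpca_features(X_train, x_test, n_components=2, sigma=1.0):
    """Sketch of weighted PCA: training samples closer to the test
    sample receive larger weights in the covariance estimate.
    The Gaussian weighting and sigma are illustrative assumptions."""
    # Distances between the test sample and every training sample
    dists = np.linalg.norm(X_train - x_test, axis=1)
    # Turn distances into weights (closer samples -> larger weights)
    w = np.exp(-dists**2 / (2.0 * sigma**2))
    w /= w.sum()
    # Weighted mean and 'weighted' covariance matrix
    mean = w @ X_train
    Xc = X_train - mean
    cov = (Xc * w[:, None]).T @ Xc
    # Eigenvectors of the weighted covariance give the projection axes
    eigvals, eigvecs = np.linalg.eigh(cov)
    order = np.argsort(eigvals)[::-1][:n_components]
    W = eigvecs[:, order]
    # Project training samples and the test sample onto the new axes
    return (X_train - mean) @ W, (x_test - mean) @ W

# Example usage with random data (illustrative only):
# X = np.random.randn(100, 50); x = np.random.randn(50)
# Z_train, z_test = wpca_features(X, x, n_components=10)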
Year
2011
DOI
10.1007/978-3-642-23896-3_70
Venue
AICI (3)
Keywords
covariance matrix, feature extraction, weighted principal component analysis, weighted pca, test sample, underlying theoretical foundation, conventional pca, high accuracy, training sample, face recognition, dimensionality reduction, eigenvectors
Field
Facial recognition system, Dimensionality reduction, Pattern recognition, Computer science, Feature extraction, Artificial intelligence, Covariance matrix, Eigenvalues and eigenvectors, Principal component analysis, Machine learning
DocType
Conference
Volume
7004
ISSN
0302-9743
Citations
3
PageRank
0.42
References
11
Authors
3
Name | Order | Citations | PageRank
Zizhu Fan | 1 | 329 | 14.61
Ergen Liu | 2 | 28 | 2.55
Baogen Xu | 3 | 122 | 19.54