Abstract |
---|
A new nonlinear feature extraction method, called kernel Foley-Sammon optimal discriminant vectors (KFSODVs), is presented in this paper. The method extends the well-known Foley-Sammon optimal discriminant vectors (FSODVs) from the linear domain to the nonlinear domain via the kernel trick used in support vector machines (SVMs) and other common kernel-based learning algorithms. It also provides an effective way to handle the so-called small sample size (SSS) problem, which arises in many classification tasks such as face recognition. We give the derivation of KFSODV and conduct experiments on both simulated and real data sets, confirming that KFSODV outperforms previously used kernel-based learning algorithms in discrimination performance. |
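The paper itself gives no code. As a rough illustration of the kernel-discriminant family that KFSODV belongs to, the sketch below implements the closely related two-class kernel Fisher discriminant in plain NumPy: it learns one feature-space direction as an expansion over training samples and regularizes the within-class scatter, which is also the usual remedy for the SSS problem mentioned in the abstract. All function names, the RBF `gamma`, and the ridge term `reg` are my assumptions; this is not the paper's exact Foley-Sammon orthogonal-vector construction.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=0.5):
    """Pairwise Gaussian (RBF) kernel matrix between rows of X and Y."""
    sq = (np.sum(X**2, axis=1)[:, None]
          + np.sum(Y**2, axis=1)[None, :]
          - 2.0 * X @ Y.T)
    return np.exp(-gamma * sq)

def kernel_fisher_direction(X, y, gamma=0.5, reg=1e-3):
    """Two-class kernel Fisher discriminant (illustrative, not KFSODV).

    Returns expansion coefficients alpha so the feature-space direction
    is w = sum_j alpha_j * phi(x_j).
    """
    K = rbf_kernel(X, X, gamma)          # n x n Gram matrix
    n = X.shape[0]
    idx0, idx1 = (y == 0), (y == 1)
    m0 = K[:, idx0].mean(axis=1)         # kernelized class means
    m1 = K[:, idx1].mean(axis=1)
    # Within-class scatter: N = sum_c K_c (I - 1/l_c) K_c^T
    N = np.zeros((n, n))
    for idx in (idx0, idx1):
        Kc = K[:, idx]
        lc = int(idx.sum())
        N += Kc @ (np.eye(lc) - np.full((lc, lc), 1.0 / lc)) @ Kc.T
    # Ridge regularization keeps the solve well-posed when n exceeds
    # the effective rank (the small-sample-size situation).
    return np.linalg.solve(N + reg * np.eye(n), m1 - m0)

def project(alpha, X_train, X_new, gamma=0.5):
    """Project new samples onto the learned discriminant direction."""
    return rbf_kernel(X_new, X_train, gamma) @ alpha
```

On two separated Gaussian blobs, projecting the training data with `project` yields one-dimensional scores in which the two class means are well separated; KFSODV differs by producing a whole set of mutually orthogonal discriminant vectors rather than a single direction.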
Year | DOI | Venue |
---|---|---|
2005 | 10.1109/TNN.2004.836239 | IEEE Transactions on Neural Networks |
Keywords | Field | DocType
---|---|---
kernel principal component analysis, face recognition, principal component analysis, kernel method, feature extraction, support vector machine, learning (artificial intelligence) | Radial basis function kernel, Pattern recognition, Computer science, Kernel embedding of distributions, Kernel Fisher discriminant analysis, Kernel principal component analysis, Polynomial kernel, Artificial intelligence, Kernel method, String kernel, Variable kernel density estimation, Machine learning | Journal
Volume | Issue | ISSN
---|---|---
16 | 1 | 1045-9227
Citations | PageRank | References
---|---|---
37 | 1.62 | 18
Authors |
---|
3 |
Name | Order | Citations | PageRank |
---|---|---|---|
Wenming Zheng | 1 | 1240 | 80.70 |
Li Zhao | 2 | 380 | 27.36 |
Cairong Zou | 3 | 415 | 27.19 |