Abstract |
---|
Generalized discriminant analysis (GDA) extends classical linear discriminant analysis (LDA) from the linear domain to a nonlinear domain via the kernel trick. However, in the original GDA algorithm the solutions may suffer from the degenerate eigenvalue problem (i.e., several eigenvectors share the same eigenvalue), which makes them suboptimal in terms of discriminant ability. In this letter, we propose a modified algorithm for GDA (MGDA) to solve this problem. The MGDA method aims to remove the degeneracy of GDA and find the optimal discriminant solutions, which maximize the between-class scatter in the subspace spanned by the degenerate eigenvectors of GDA. Theoretical analysis and experimental results on the ORL face database show that the MGDA method achieves better performance than the GDA method. |
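The degeneracy-removal idea described in the abstract can be sketched in a few lines of NumPy. This is a minimal illustration under stated assumptions, not the authors' published MGDA derivation: `disambiguate_degenerate`, `Sb` (a between-class scatter matrix), and `tol` are assumed names. Within each group of eigenvectors sharing an eigenvalue, the sketch re-diagonalizes the between-class scatter restricted to that subspace, so the returned basis maximizes between-class scatter there while leaving non-degenerate eigenvectors unchanged.

```python
import numpy as np

def disambiguate_degenerate(eigvals, eigvecs, Sb, tol=1e-8):
    """Within each group of (near-)degenerate eigenvalues, rotate the
    corresponding eigenvectors so they diagonalize the between-class
    scatter Sb restricted to that subspace, ordered by decreasing Sb.
    eigvals: (n,) eigenvalues; eigvecs: (n, n) eigenvectors as columns.
    Returns the sorted eigenvalues and the disambiguated eigenvectors."""
    order = np.argsort(eigvals)[::-1]          # sort descending
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    out = eigvecs.copy()
    i, n = 0, len(eigvals)
    while i < n:
        j = i
        while j + 1 < n and abs(eigvals[j + 1] - eigvals[i]) < tol:
            j += 1                             # extend the degenerate block
        if j > i:
            V = eigvecs[:, i:j + 1]            # basis of the degenerate subspace
            _, U = np.linalg.eigh(V.T @ Sb @ V)  # diagonalize restricted Sb
            out[:, i:j + 1] = V @ U[:, ::-1]   # descending between-class scatter
        i = j + 1
    return eigvals, out
```

Any rotation within a degenerate block is still an eigenbasis of the original problem, so this extra maximization costs nothing in the original criterion; it only resolves the arbitrariness that the abstract identifies as the source of suboptimality.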
Year | DOI | Venue |
---|---|---|
2004 | 10.1162/089976604773717612 | Neural Computation |
Keywords | Field | DocType
---|---|---
generalized discriminant analysis, eigenvalues, eigenvectors | Optimal discriminant analysis, Mathematical optimization, Subspace topology, Discriminant, Kernel Fisher discriminant analysis, Algorithm, Degeneracy (mathematics), Linear discriminant analysis, Kernel method, Mathematics, Eigenvalues and eigenvectors | Journal
Volume | Issue | ISSN
---|---|---
16 | 6 | 0899-7667
Citations | PageRank | References
---|---|---
21 | 1.82 | 8
Authors |
---|
3 |
Name | Order | Citations | PageRank |
---|---|---|---|
Wenming Zheng | 1 | 1240 | 80.70 |
Li Zhao | 2 | 380 | 27.36 |
Cairong Zou | 3 | 415 | 27.19 |