Abstract |
---|
Canonical Correlation Analysis [3] is used when we have two data sets that we believe share some underlying correlation. In this paper, we derive a new family of neural methods for finding the canonical correlation directions by solving a generalized eigenvalue problem. Based on the differential equation for the generalized eigenvalue problem, a family of CCA learning algorithms can be obtained. We compare our family of methods with a previously derived CCA learning algorithm [2]. Our results show that all the new learning algorithms in this family have the same order of convergence speed and, in particular, are much faster than existing algorithms; they can also find greater nonlinear correlations and are much more robust with respect to parameter selection. |
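The abstract casts CCA as a generalized eigenvalue problem. A minimal batch sketch of that formulation (not the neural/online algorithms the paper derives) builds the block system `A v = ρ B v` with the cross-covariance `Cxy` off-diagonal in `A` and the within-set covariances `Cxx`, `Cyy` on the diagonal of `B`; the top eigenvalue is the first canonical correlation. The synthetic data below is hypothetical, chosen only so the two sets share one latent signal:

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)
n = 2000
# A shared latent signal z induces correlation between the two data sets.
z = rng.standard_normal(n)
X = np.column_stack([z + 0.5 * rng.standard_normal(n),
                     rng.standard_normal(n)])
Y = np.column_stack([rng.standard_normal(n),
                     z + 0.5 * rng.standard_normal(n)])

# Center, then form sample covariance blocks.
X = X - X.mean(axis=0)
Y = Y - Y.mean(axis=0)
Cxx = X.T @ X / n
Cyy = Y.T @ Y / n
Cxy = X.T @ Y / n

# Generalized eigenvalue problem  A v = rho B v,  v = [wx; wy].
p, q = X.shape[1], Y.shape[1]
A = np.zeros((p + q, p + q))
A[:p, p:] = Cxy
A[p:, :p] = Cxy.T
B = np.zeros((p + q, p + q))
B[:p, :p] = Cxx
B[p:, p:] = Cyy

# eigh solves the symmetric-definite problem; eigenvalues come back
# in ascending order, so the last one is the first canonical correlation.
vals, vecs = eigh(A, B)
rho = vals[-1]
wx, wy = vecs[:p, -1], vecs[p:, -1]

# Sanity check: correlation of the projected data matches rho.
corr = np.corrcoef(X @ wx, Y @ wy)[0, 1]
```

The eigenvalues of this doubled system come in ± pairs, one pair per canonical correlation, which is why only the top half of the spectrum is of interest.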
Year | DOI | Venue |
---|---|---|
2000 | 10.1007/3-540-44491-2_25 | IDEAL |
Keywords | Field | DocType |
---|---|---|
convergence speed, greater nonlinear correlation, differential equation, new family, generalised canonical correlation analysis, generalized eigenvalue problem, canonical correlation direction, underlying correlation, new learning algorithm, canonical correlation analysis, canonical correlation, order of convergence | Discrete mathematics, Differential equation, Applied mathematics, Combinatorics, Nonlinear system, Lagrange multiplier, Canonical correlation, Correlation, Rate of convergence, Eigendecomposition of a matrix, Artificial neural network, Mathematics | Conference |
ISBN | Citations | PageRank |
---|---|---|
3-540-41450-9 | 1 | 0.35 |
References | Authors |
---|---|
3 | 2 |
Name | Order | Citations | PageRank |
---|---|---|---|
Zhenkun Gou | 1 | 53 | 3.56 |
Colin Fyfe | 2 | 508 | 55.62 |