Title
A least squares formulation for a class of generalized eigenvalue problems in machine learning
Abstract
Many machine learning algorithms can be formulated as a generalized eigenvalue problem. One major limitation of such a formulation is that the generalized eigenvalue problem is computationally expensive to solve, especially for large-scale problems. In this paper, we show that under a mild condition, a class of generalized eigenvalue problems in machine learning can be formulated as a least squares problem. This class of problems includes classical techniques such as Canonical Correlation Analysis (CCA), Partial Least Squares (PLS), and Linear Discriminant Analysis (LDA), as well as Hypergraph Spectral Learning (HSL). As a result, various regularization techniques can be readily incorporated into the formulation to improve model sparsity and generalization ability. In addition, the least squares formulation leads to efficient and scalable implementations based on iterative conjugate-gradient-type algorithms. We report experimental results that confirm the established equivalence relationship and demonstrate the efficiency and effectiveness of the equivalent least squares formulations on large-scale problems.
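The following is a minimal numerical sketch (not the authors' code) of the style of equivalence the abstract describes, specialized to binary LDA, where it is classical that the discriminant direction from the generalized eigenvalue problem S_b w = lambda S_w w coincides, up to scale, with an ordinary least squares solution against suitably coded class labels. The synthetic data, the particular label coding, and the use of SciPy's LSQR as the conjugate-gradient-type solver are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch: binary LDA via a generalized eigenvalue problem vs. an
# equivalent least squares problem. The paper generalizes this style of
# equivalence to CCA, PLS, multi-class LDA, and HSL under a mild condition.
import numpy as np
from scipy.linalg import eigh
from scipy.sparse.linalg import lsqr

rng = np.random.default_rng(0)
n1, n2, d = 120, 80, 10
X = np.vstack([rng.normal(0.0, 1.0, (n1, d)),
               rng.normal(1.0, 1.0, (n2, d))])   # rows are samples
y = np.array([0] * n1 + [1] * n2)
n = n1 + n2

Xc = X - X.mean(axis=0)                          # center the data

# --- Generalized eigenvalue formulation: S_b w = lambda * S_w w ---
mu1, mu2 = X[y == 0].mean(axis=0), X[y == 1].mean(axis=0)
Sb = np.outer(mu1 - mu2, mu1 - mu2)              # between-class scatter
Sw = sum(np.cov(X[y == k].T, bias=True) * (y == k).sum()
         for k in (0, 1))                        # within-class scatter
evals, evecs = eigh(Sb, Sw)                      # dense generalized eigensolver
w_eig = evecs[:, -1]                             # top eigenvector

# --- Equivalent least squares formulation ---
# Encode class labels so the regression target is centered; this +n2/n, -n1/n
# coding is one standard choice for the binary case.
t = np.where(y == 0, n2 / n, -n1 / n)
# Solve min_w ||Xc w - t||^2 with LSQR, a conjugate-gradient-type iterative
# solver that touches the data only through matrix-vector products.
w_ls = lsqr(Xc, t)[0]

# The two directions agree up to sign and scale (|cosine| close to 1).
cos = abs(w_eig @ w_ls) / (np.linalg.norm(w_eig) * np.linalg.norm(w_ls))
print(f"|cosine| between eigen and least squares directions: {cos:.6f}")
```

Because the iterative solver never forms or factors the scatter matrices, the least squares route scales to problems where a dense generalized eigendecomposition would be prohibitive, which is the efficiency argument made in the abstract.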
Year
2009
DOI
10.1145/1553374.1553499
Venue
ICML
Keywords
hypergraph spectral learning, linear discriminant analysis, large-scale problem, established equivalence relationship, machine learning, generalized eigenvalue problem, squares problem, classical technique, canonical correlation analysis, squares formulation, least square, eigenvalues
Field
Least squares, Least squares support vector machine, Generalized least squares, Iteratively reweighted least squares, Artificial intelligence, Divide-and-conquer eigenvalue algorithm, Non-linear least squares, Total least squares, Recursive least squares filter, Machine learning, Mathematics
DocType
Conference
Citations
21
PageRank
0.83
References
14
Authors
3
Name          Order  Citations  PageRank
Liang Sun     1      500        24.61
Shuiwang Ji   2      2579       122.25
Jieping Ye    3      6943       351.37