| Abstract |
|---|
| Kernel methods are widespread in machine learning; however, they are limited by the quadratic complexity of constructing, applying, and storing kernel matrices. Low-rank matrix approximation algorithms are widely used to address this problem and reduce the arithmetic and storage cost. However, we observed that for some datasets with wide intraclass variability, the optimal kernel parameter for smaller classes yields a matrix that is less well approximated by low-rank methods. In this paper, we propose an efficient structured low-rank approximation method, the block basis factorization (BBF), together with a fast construction algorithm, to approximate radial basis function (RBF) kernel matrices. Our approach requires linear memory and a linear number of floating point operations for many machine learning kernels. BBF works for a wide range of kernel bandwidth parameters and significantly extends the domain of applicability of low-rank approximation methods. Our empirical results demonstrate its stability and its superiority over state-of-the-art kernel approximation algorithms. |
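The abstract contrasts BBF with standard low-rank kernel approximation. As background, the baseline it improves upon can be illustrated with a plain Nyström low-rank approximation of an RBF kernel matrix; this is a minimal sketch of that standard technique, not the BBF algorithm itself, and the function names (`rbf_kernel`, `nystrom`), bandwidth, and rank values below are illustrative choices, not taken from the paper.

```python
import numpy as np

def rbf_kernel(X, Y, h):
    """Dense RBF (Gaussian) kernel: K[i, j] = exp(-||x_i - y_j||^2 / (2 h^2))."""
    sq = (np.sum(X**2, axis=1)[:, None]
          + np.sum(Y**2, axis=1)[None, :]
          - 2.0 * X @ Y.T)
    return np.exp(-np.maximum(sq, 0.0) / (2.0 * h**2))

def nystrom(X, h, k, rng):
    """Rank-k Nystrom factors: K is approximated by C @ Winv @ C.T,
    built from k uniformly sampled landmark columns (O(n*k) storage)."""
    idx = rng.choice(X.shape[0], size=k, replace=False)
    C = rbf_kernel(X, X[idx], h)        # n x k block of sampled columns
    Winv = np.linalg.pinv(C[idx, :])    # pseudo-inverse of the k x k core
    return C, Winv

rng = np.random.default_rng(0)
X = rng.standard_normal((500, 10))      # synthetic data, illustrative only
h = 3.0                                 # kernel bandwidth (assumed value)
K = rbf_kernel(X, X, h)                 # exact n x n kernel matrix
C, Winv = nystrom(X, h, k=50, rng=rng)
K_approx = C @ Winv @ C.T               # low-rank reconstruction
rel_err = np.linalg.norm(K - K_approx) / np.linalg.norm(K)
print(rel_err)
```

The paper's observation is that for some bandwidths such a globally low-rank approximation degrades, which motivates BBF's block structure.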
| Year | DOI | Venue |
|---|---|---|
| 2019 | 10.1137/18M1212586 | SIAM Journal on Matrix Analysis and Applications |

| Keywords | Field | DocType |
|---|---|---|
| kernel matrix, low-rank approximation, data-sparse representation, machine learning, high-dimensional data, RBF | Kernel (linear algebra), Mathematical optimization, Clustering high-dimensional data, Quadratic complexity, Matrix (mathematics), Algorithm, Low-rank approximation, Factorization, Kernel method, Mathematics, Scalability | Journal |

| Volume | Issue | ISSN |
|---|---|---|
| 40 | 4 | 0895-4798 |

| Citations | PageRank | References |
|---|---|---|
| 0 | 0.34 | 0 |
| Authors |
|---|
| 4 |

| Name | Order | Citations | PageRank |
|---|---|---|---|
| Ruoxi Wang | 1 | 6 | 3.14 |
| Yingzhou Li | 2 | 4 | 2.80 |
| Michael W. Mahoney | 3 | 3297 | 218.10 |
| Eric Darve | 4 | 440 | 44.79 |