Title
On the Impact of Kernel Approximation on Learning Accuracy
Abstract
Kernel approximation is commonly used to scale kernel-based algorithms to applications containing as many as several million instances. This paper analyzes the effect of such approximations in the kernel matrix on the hypothesis generated by several widely used learning algorithms. We give stability bounds based on the norm of the kernel approximation for these algorithms, including SVMs, KRR, and graph Laplacian-based regularization algorithms. These bounds help determine the degree of approximation that can be tolerated in the estimation of the kernel matrix. Our analysis is general and applies to arbitrary approximations of the kernel matrix. However, we also give a specific analysis of the Nyström low-rank approximation in this context and report the results of experiments evaluating the quality of the Nyström low-rank kernel approximation when used with ridge regression.
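To make the setting described in the abstract concrete, the sketch below illustrates a standard Nyström low-rank kernel approximation used with kernel ridge regression, comparing the exact and approximate hypotheses on the same data. This is only an illustrative example, not the paper's experimental code; the RBF kernel, the uniform column sampling, the regularization value, the synthetic dataset, and helper names such as `nystrom_approximation` are assumptions made here for demonstration.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of X and Y (an assumed kernel choice)."""
    sq = (X**2).sum(1)[:, None] + (Y**2).sum(1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-gamma * sq)

def nystrom_approximation(K, m, rng):
    """Rank-m Nystrom approximation K_tilde = C W^+ C^T built from m uniformly sampled columns."""
    n = K.shape[0]
    idx = rng.choice(n, size=m, replace=False)   # landmark (column) indices
    C = K[:, idx]                                # n x m block of sampled columns
    W = K[np.ix_(idx, idx)]                      # m x m intersection block
    return C @ np.linalg.pinv(W) @ C.T

def krr_fit_predict(K_train, y, K_eval, lam=1e-2):
    """Kernel ridge regression: solve (K + lam*I) alpha = y, then predict with K_eval @ alpha."""
    n = K_train.shape[0]
    alpha = np.linalg.solve(K_train + lam * np.eye(n), y)
    return K_eval @ alpha

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))                    # toy data, for illustration only
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=500)

K = rbf_kernel(X, X, gamma=0.5)
K_approx = nystrom_approximation(K, m=50, rng=rng)

pred_exact = krr_fit_predict(K, y, K, lam=1e-2)
pred_approx = krr_fit_predict(K_approx, y, K_approx, lam=1e-2)

# Size of the kernel-matrix perturbation vs. the resulting change in the KRR hypothesis,
# the kind of relationship the paper's stability bounds quantify.
print("||K - K_approx||_2 =", np.linalg.norm(K - K_approx, 2))
print("max |h(x) - h'(x)| =", np.abs(pred_exact - pred_approx).max())
```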
Year
2010
Venue
AISTATS
Keywords
graph Laplacian, ridge regression, low-rank approximation
Volume
9
Citations
51
PageRank
1.75
References
16
Authors
3
Name               Order   Citations   PageRank
Corinna Cortes     1       6574        1120.50
Mehryar Mohri      2       4502        448.21
Ameet Talwalkar    3       1394        66.51