Title: Fast and scalable polynomial kernels via explicit feature maps
Abstract: Approximation of non-linear kernels using random feature mapping has been successfully employed in large-scale data analysis applications, accelerating the training of kernel machines. While previous random feature mappings run in O(ndD) time for n training samples in d-dimensional space and D random feature maps, we propose a novel randomized tensor product technique, called Tensor Sketching, for approximating any polynomial kernel in O(n(d + D log D)) time. We also derive both absolute and relative error bounds for our approximation, guaranteeing the reliability of the estimation algorithm. Empirically, Tensor Sketching achieves higher accuracy and often runs orders of magnitude faster than the state-of-the-art approach on large-scale real-world datasets.
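The abstract's O(n(d + D log D)) bound comes from combining Count Sketch with the FFT: the convolution of p independent Count Sketches of x is itself a Count Sketch of the p-fold tensor power of x, whose inner products approximate the degree-p polynomial kernel. The following is a minimal illustrative sketch of that construction (function names and parameters are our own, not from the paper's code):

```python
import numpy as np

def count_sketch(x, h, s, D):
    """Hash the d coordinates of x into D bins with random signs s in {-1, +1}."""
    out = np.zeros(D)
    np.add.at(out, h, s * x)  # accumulate coordinates that collide in a bin
    return out

def tensor_sketch(x, hashes, signs, D):
    """Sketch the p-fold tensor power of x in O(p(d + D log D)) time.

    Convolving p independent Count Sketches (done here in the Fourier
    domain, hence the D log D term) yields a Count Sketch of the tensor
    product x (x) ... (x) x.
    """
    fft_prod = np.ones(D, dtype=complex)
    for h, s in zip(hashes, signs):
        fft_prod *= np.fft.fft(count_sketch(x, h, s, D))
    return np.real(np.fft.ifft(fft_prod))

# Hypothetical usage: approximate the degree-2 polynomial kernel (x.y)^2.
rng = np.random.default_rng(0)
d, D, p = 10, 1024, 2
hashes = [rng.integers(0, D, size=d) for _ in range(p)]
signs = [rng.choice([-1.0, 1.0], size=d) for _ in range(p)]
x, y = rng.standard_normal(d), rng.standard_normal(d)
approx = tensor_sketch(x, hashes, signs, D) @ tensor_sketch(y, hashes, signs, D)
exact = (x @ y) ** p  # the sketch inner product is an unbiased estimate of this
```

The inner product of two sketches equals the polynomial kernel value in expectation; the variance shrinks as D grows, which is what the paper's error bounds quantify.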
Year: 2013
DOI: 10.1145/2487575.2487591
Venue: KDD
Keywords: explicit feature map, large-scale real-world datasets, kernel machine, d random feature map, tensor sketching, previous random feature, large-scale data analysis application, non-linear kernel, random feature mapping, scalable polynomial kernel, polynomial kernel, training sample, svm, fft, tensor product
Field: Kernel (linear algebra), Tensor product, Discrete mathematics, Tensor, Polynomial, Support vector machine, Polynomial kernel, Fast Fourier transform, Artificial intelligence, Approximation error, Machine learning, Mathematics
DocType: Conference
Citations: 76
PageRank: 2.16
References: 19
Authors: 2

Name          Order  Citations  PageRank
Ninh Pham     1      169        7.68
Rasmus Pagh   2      1344       86.08