Abstract |
---|
With the goal of reducing the training and testing complexity of nonlinear kernel methods, several recent papers have proposed explicit embeddings of the input data into low-dimensional feature spaces, where fast linear methods can instead be used to generate approximate solutions. Analogous to random Fourier feature maps, which approximate shift-invariant kernels such as the Gaussian kernel on ℝᵈ, we develop a new randomized technique called random Laplace features to approximate a family of kernel functions adapted to the semigroup structure of ℝ₊ᵈ. This is the natural algebraic structure on the set of histograms and other non-negative data representations. We provide theoretical results on the uniform convergence of random Laplace features. Empirical analyses on image classification and surveillance event detection tasks demonstrate the attractiveness of using random Laplace features relative to several other feature maps proposed in the literature. |
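The idea sketched in the abstract can be illustrated with a minimal Monte Carlo construction: draw random vectors w from a measure on ℝ₊ᵈ and map each non-negative input x to features exp(−⟨w, x⟩), so that the inner product of two feature vectors estimates the semigroup kernel K(x, y) = E_w[exp(−⟨x + y, w⟩)]. The sampling distribution below (i.i.d. Exp(1) draws per coordinate, whose Laplace transform gives K(x, y) = ∏ⱼ 1/(1 + xⱼ + yⱼ)) is an illustrative assumption for this sketch, not the specific measure studied in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_laplace_features(X, W):
    """Map non-negative data X (n x d) to D random Laplace features.

    W (D x d) holds samples from a measure mu on R_+^d. The inner product
    of two feature vectors is a Monte Carlo estimate of the semigroup
    kernel K(x, y) = E_w[exp(-<x + y, w>)].
    """
    D = W.shape[0]
    return np.exp(-X @ W.T) / np.sqrt(D)

d, D = 5, 20000
# Illustrative (hypothetical) choice of measure: i.i.d. Exp(1) samples,
# whose Laplace transform yields K(x, y) = prod_j 1 / (1 + x_j + y_j).
W = rng.exponential(scale=1.0, size=(D, d))

# Non-negative inputs, e.g. histogram bin values.
x = rng.random(d)
y = rng.random(d)

# Linear inner product in feature space approximates the nonlinear kernel.
approx = (random_laplace_features(x[None, :], W)
          @ random_laplace_features(y[None, :], W).T).item()
exact = np.prod(1.0 / (1.0 + x + y))
```

Because the features are an average of D i.i.d. unbiased terms, the approximation error shrinks at the usual O(1/√D) Monte Carlo rate, which is what the paper's uniform convergence results make precise over compact sets.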
Year | DOI | Venue |
---|---|---|
2014 | 10.1109/CVPR.2014.129 | CVPR |
Keywords | Field | DocType
---|---|---
low-dimensional feature spaces, histograms, approximate solutions, surveillance event detection, non-negative data representations, shift-invariant kernels, Fourier transforms, data structures, surveillance, image classification, random Fourier feature maps, semigroup kernels, natural algebraic structure, random Laplace feature maps, group theory, Laplace transforms, accuracy, feature extraction, kernel | Histogram, Computer science, Uniform convergence, Fourier transform, Artificial intelligence, Gaussian function, Discrete mathematics, Pattern recognition, Laplace transform, Algorithm, Kernel method, Semigroup, Kernel (statistics) | Conference

ISSN | Citations | PageRank
---|---|---
1063-6919 | 16 | 0.81

References | Authors
---|---
13 | 5
Name | Order | Citations | PageRank |
---|---|---|---|
Jiyan Yang | 1 | 68 | 7.93 |
Vikas Sindhwani | 2 | 3423 | 154.85 |
Quanfu Fan | 3 | 504 | 32.69 |
Haim Avron | 4 | 316 | 28.52
Michael W. Mahoney | 5 | 3297 | 218.10 |