Title: TripleSpin - a generic compact paradigm for fast machine learning computations.
Abstract
We present a generic compact computational framework relying on structured random matrices that can be applied to speed up several machine learning algorithms with almost no loss of accuracy. The applications include new fast LSH-based algorithms, efficient kernel computations via random feature maps, convex optimization algorithms, quantization techniques, and more. Certain models of the presented paradigm are even more compressible since they use only bit matrices, which makes them suitable for deployment on mobile devices. All our findings come with strong theoretical guarantees. In particular, as a byproduct of the presented techniques, and by using a relatively recent Berry-Esseen-type CLT for random vectors, we give the first theoretical guarantees for one of the most efficient existing LSH algorithms, based on the $\textbf{HD}_{3}\textbf{HD}_{2}\textbf{HD}_{1}$ structured matrix ("Practical and Optimal LSH for Angular Distance"). These guarantees, as well as the theoretical results for the other aforementioned applications, follow from a single general theoretical principle presented in the paper. Our structured family contains as special cases all previously considered structured schemes, including the recently introduced $P$-model. Experimental evaluation confirms the accuracy and efficiency of TripleSpin matrices.
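The $\textbf{HD}_{3}\textbf{HD}_{2}\textbf{HD}_{1}$ transform mentioned in the abstract alternates a Walsh-Hadamard transform $\textbf{H}$ with random diagonal $\pm 1$ matrices $\textbf{D}_i$, giving a fast pseudo-random rotation whose signs can serve as angular-LSH hash bits. A minimal NumPy sketch of this construction follows; it assumes a power-of-two dimension, and the function names (`fwht`, `hd3hd2hd1`) are illustrative, not from the paper's code.

```python
import numpy as np

def fwht(x):
    """Orthonormal fast Walsh-Hadamard transform of a length-2^k vector."""
    x = x.astype(float).copy()
    n = len(x)
    h = 1
    while h < n:
        for i in range(0, n, 2 * h):
            a = x[i:i + h].copy()
            b = x[i + h:i + 2 * h].copy()
            x[i:i + h] = a + b          # butterfly: sums
            x[i + h:i + 2 * h] = a - b  # butterfly: differences
        h *= 2
    return x / np.sqrt(n)  # normalize so the transform preserves norms

def hd3hd2hd1(x, diags):
    """Apply H D3 H D2 H D1 to x; diags = [D1, D2, D3] as +/-1 vectors."""
    y = x
    for d in diags:
        y = fwht(d * y)
    return y

rng = np.random.default_rng(0)
n = 16                                              # must be a power of two
diags = [rng.choice([-1.0, 1.0], size=n) for _ in range(3)]
x = rng.standard_normal(n)
y = hd3hd2hd1(x, diags)
hash_bits = (y > 0).astype(int)                     # sign hash for angular LSH
```

Because each factor ($\textbf{H}$ normalized, $\textbf{D}_i$ diagonal $\pm 1$) is orthogonal, the product preserves Euclidean norms while costing only $O(n \log n)$ time and $3n$ random bits, versus $O(n^2)$ time and $n^2$ Gaussian entries for an unstructured random projection.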
Year: 2016
Venue: arXiv: Learning
Field: Computer science, Matrix (mathematics), Theoretical computer science, Artificial intelligence, Computation, Speedup, Kernel (linear algebra), Mathematical optimization, Algorithm, Angular distance, Quantization (signal processing), Convex optimization, Machine learning, Random matrix
DocType:
Volume: abs/1605.09046
Citations: 1
Journal:
PageRank: 0.37
References: 16
Authors: 6
Name                   Order  Citations  PageRank
Krzysztof Choromanski  1      124        23.56
Francois Fagan         2      3          1.46
Cédric Gouy-Pailler    3      62         10.69
Anne Morvan            4      3          1.42
Tamás Sarlós           5      477        25.73
Jamal Atif             6      309        29.49