Title
Communication-Efficient Distributed Online Learning with Kernels.
Abstract
We propose an efficient distributed online learning protocol for low-latency real-time services. It extends a previously presented protocol to kernelized online learners that represent their models by a support vector expansion. While such learners often achieve higher predictive performance than their linear counterparts, communicating the support vector expansions becomes inefficient for large numbers of support vectors. The proposed extension allows for a larger class of online learning algorithms, including those that alleviate this problem through model compression. In addition, we characterize the quality of the proposed protocol by introducing a novel criterion that requires the communication to be bounded by the loss suffered.
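The abstract describes kernelized online learners whose models are support vector expansions and whose communication cost is kept low through model compression. As a rough illustration of that idea only, and not the protocol of the paper, the following Python sketch implements a budgeted kernel perceptron: the model is a support vector expansion, and a simple truncation rule keeps the expansion small before it would be communicated. The Gaussian kernel, the budget size, the truncation rule, and all class and function names are assumptions made for this example.

import numpy as np

# Illustrative sketch only: a kernelized online learner whose model is a
# support vector expansion, with a budget-based compression step.
# Kernel choice, budget, and truncation rule are assumptions of this example.

def gaussian_kernel(x, z, gamma=1.0):
    """Gaussian (RBF) kernel between two vectors."""
    return np.exp(-gamma * np.sum((x - z) ** 2))

class BudgetedKernelPerceptron:
    """Online kernel perceptron keeping at most `budget` support vectors."""

    def __init__(self, budget=50, gamma=1.0):
        self.budget = budget
        self.gamma = gamma
        self.support_vectors = []   # stored examples x_i
        self.coefficients = []      # corresponding signed weights alpha_i

    def predict_raw(self, x):
        # f(x) = sum_i alpha_i * k(x_i, x) -- the support vector expansion
        return sum(a * gaussian_kernel(sv, x, self.gamma)
                   for a, sv in zip(self.coefficients, self.support_vectors))

    def update(self, x, y):
        """Perceptron-style update: add x as a support vector on a mistake."""
        if y * self.predict_raw(x) <= 0:    # loss suffered -> model changes
            self.support_vectors.append(np.asarray(x, dtype=float))
            self.coefficients.append(float(y))
            self._compress()
            return True
        return False

    def _compress(self):
        """Model compression by truncation: drop the support vector with the
        smallest absolute coefficient once the budget is exceeded."""
        if len(self.support_vectors) > self.budget:
            idx = int(np.argmin(np.abs(self.coefficients)))
            del self.support_vectors[idx]
            del self.coefficients[idx]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    learner = BudgetedKernelPerceptron(budget=20, gamma=0.5)
    mistakes = 0
    for _ in range(500):
        x = rng.normal(size=2)
        y = 1.0 if x[0] * x[1] > 0 else -1.0   # non-linear (XOR-like) target
        mistakes += learner.update(x, y)
    print(f"mistakes: {mistakes}, "
          f"support vectors kept: {len(learner.support_vectors)}")

In a distributed setting of the kind the abstract sketches, only the compressed expansion (support vectors and coefficients) would need to be exchanged between learners, and updates, and hence communication, occur only when loss is suffered.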
Year
2016
Venue
ECML/PKDD
Field
Online learning, Computer science, Support vector machine, Artificial intelligence, Model compression, Machine learning, Bounded function
DocType
Conference
ISSN
Machine Learning and Knowledge Discovery in Databases. ECML PKDD 2016
Citations
1
PageRank
0.36
References
9
Authors
4
Name             Order  Citations  PageRank
Michael Kamp     1      17         5.80
Sebastian Bothe  2      39         4.55
Mario Boley      3      221        17.08
Michael Mock     4      175        25.30