Title
CoCoA: A General Framework for Communication-Efficient Distributed Optimization
Abstract
The scale of modern datasets necessitates the development of efficient distributed optimization methods for machine learning. We present CoCoA, a general-purpose framework for distributed computing environments with an efficient communication scheme, applicable to a wide variety of problems in machine learning and signal processing. We extend the framework to cover general non-strongly-convex regularizers, including L1-regularized problems like lasso, sparse logistic regression, and elastic net regularization, and show how earlier work can be derived as a special case. We provide convergence guarantees for the class of convex regularized loss minimization objectives, leveraging a novel approach to handling non-strongly-convex regularizers and non-smooth loss functions. The resulting framework markedly outperforms state-of-the-art methods, as we illustrate with an extensive set of experiments on real distributed datasets.
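To make the problem class named in the abstract concrete, a minimal sketch of the regularized loss minimization template follows; the notation (w, x_i, ell_i, g, lambda) is illustrative and not drawn verbatim from the paper:

\[
\min_{w \in \mathbb{R}^d} \; P(w) := \frac{1}{n} \sum_{i=1}^{n} \ell_i\!\left(x_i^\top w\right) \;+\; \lambda\, g(w),
\]

where each loss \(\ell_i\) is convex and possibly non-smooth (e.g., the hinge loss), and \(g\) is a convex regularizer that need not be strongly convex: \(g(w) = \lVert w \rVert_1\) gives lasso-type objectives, while the elastic net combines \(\lVert w \rVert_1\) with \(\tfrac{1}{2}\lVert w \rVert_2^2\).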
Year: 2017
Venue: Journal of Machine Learning Research
Keywords: convex optimization, distributed systems, large-scale machine learning, parallel and distributed algorithms
DocType: Journal
Volume: 18
Issue: 230
ISSN: 1532-4435
Citations: 5
PageRank: 0.52
References: 0
Authors: 6
Name                Order   Citations   PageRank
Virginia Smith      1       339         20.52
Simone Forte        2       7           0.98
Chenxin Ma          3       73          5.25
Martin Takáč        4       752         49.49
Michael I. Jordan   5       31220       3640.80
Martin Jaggi        6       852         54.16