Abstract
---
The scale of modern datasets necessitates the development of efficient distributed optimization methods for machine learning. We present a general-purpose framework for distributed computing environments, CoCoA, that has an efficient communication scheme and is applicable to a wide variety of problems in machine learning and signal processing. We extend the framework to cover general non-strongly-convex regularizers, including L1-regularized problems like lasso, sparse logistic regression, and elastic net regularization, and show how earlier work can be derived as a special case. We provide convergence guarantees for the class of convex regularized loss minimization objectives, leveraging a novel approach in handling non-strongly-convex regularizers and non-smooth loss functions. The resulting framework has markedly improved performance over state-of-the-art methods, as we illustrate with an extensive set of experiments on real distributed datasets.
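For context, the class of objectives referred to in the abstract (regularized loss minimization with a possibly non-strongly-convex regularizer) is commonly written in the standard form sketched below. The notation here is an illustrative assumption, not taken verbatim from the abstract.

```latex
% Standard regularized empirical loss minimization objective (illustrative notation):
% n data points x_i in R^d, convex loss functions \ell_i, and a convex
% (possibly non-strongly-convex) regularizer r with parameter \lambda > 0.
\begin{equation*}
  \min_{w \in \mathbb{R}^d} \;
    \frac{1}{n} \sum_{i=1}^{n} \ell_i\!\left(x_i^{\top} w\right) + \lambda\, r(w)
\end{equation*}
% Examples named in the abstract fit this template: lasso with r(w) = \|w\|_1,
% and elastic net with r(w) = \alpha \|w\|_1 + \tfrac{1-\alpha}{2}\|w\|_2^2,
% where \alpha \in [0, 1] mixes the L1 and L2 penalties.
```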
| Year | Venue | Keywords |
|---|---|---|
| 2017 | Journal of Machine Learning Research | convex optimization, distributed systems, large-scale machine learning, parallel and distributed algorithms |

| DocType | Volume | Issue |
|---|---|---|
| Journal | 18 | 230 |

| ISSN | Citations | PageRank |
|---|---|---|
| 1532-4435 | 5 | 0.52 |

| References | Authors |
|---|---|
| 0 | 6 |
| Name | Order | Citations | PageRank |
|---|---|---|---|
| Virginia Smith | 1 | 339 | 20.52 |
| Simone Forte | 2 | 7 | 0.98 |
| Chenxin Ma | 3 | 73 | 5.25 |
| Martin Takác | 4 | 752 | 49.49 |
| Michael I. Jordan | 5 | 31220 | 3640.80 |
| Martin Jaggi | 6 | 852 | 54.16 |