Abstract
---
We introduce a framework for designing primal methods in the decentralized optimization setting where local functions are smooth and strongly convex. Our approach consists of approximately solving a sequence of sub-problems induced by the accelerated augmented Lagrangian method, thereby providing a systematic way to derive several well-known decentralized algorithms, including EXTRA (arXiv:1404.6264) and SSDA (arXiv:1702.08704). When coupled with accelerated gradient descent, our framework yields a novel primal algorithm whose convergence rate is optimal, matching recently derived lower bounds. We provide experimental results demonstrating the effectiveness of the proposed algorithm on highly ill-conditioned problems.
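The inner/outer structure the abstract describes (approximately minimizing a sequence of augmented-Lagrangian sub-problems with a first-order inner solver, then updating a dual variable) can be sketched on a toy consensus problem. Everything below is an illustrative assumption, not the paper's algorithm: the ring graph, the quadratic local functions, the penalty `beta`, and the iteration counts are invented, and the inner solver is plain gradient descent rather than the accelerated methods the framework is paired with.

```python
import numpy as np

# Toy decentralized problem: n agents, each with a smooth, strongly
# convex local quadratic f_i(x) = 0.5 * a_i * x^2 - b_i * x.
# (All data below is invented for illustration.)
n = 5
rng = np.random.default_rng(0)
a = rng.uniform(1.0, 10.0, n)   # local curvatures (strong convexity)
b = rng.uniform(-1.0, 1.0, n)   # local linear terms

# Ring-graph Laplacian L: L x = 0  <=>  all agents agree (consensus).
L = 2.0 * np.eye(n)
for i in range(n):
    L[i, (i + 1) % n] -= 1.0
    L[i, (i - 1) % n] -= 1.0

def grad_F(x):
    """Gradient of F(x) = sum_i f_i(x_i); uses only local information."""
    return a * x - b

beta = 1.0              # penalty parameter (arbitrary choice)
lam = np.zeros(n)       # multiplier for the constraint L x = 0
x = np.zeros(n)

for outer in range(100):
    # Inner loop: approximately minimize the augmented Lagrangian
    #   F(x) + lam^T (L x) + (beta / 2) * ||L x||^2
    # by plain gradient descent; the paper's framework would plug an
    # accelerated inner solver in here instead.
    step = 1.0 / (a.max() + beta * 16.0)   # crude smoothness bound (||L|| <= 4)
    for _ in range(500):
        x -= step * (grad_F(x) + L @ lam + beta * (L @ (L @ x)))
    lam += beta * (L @ x)   # dual ascent on the multiplier

# Under consensus, the minimizer of sum_i f_i is x* = sum(b) / sum(a).
x_star = b.sum() / a.sum()
```

The key structural point is the separation of concerns: the outer loop handles the consensus constraint through the dual variable, while any off-the-shelf solver for smooth strongly convex problems can serve as the inner routine.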
Year | Venue | DocType | Volume | Citations | PageRank | References | Authors
---|---|---|---|---|---|---|---
2020 | NIPS 2020 | Conference | 33 | 0 | 0.34 | 0 | 6
Name | Order | Citations | PageRank |
---|---|---|---|
Yossi Arjevani | 1 | 34 | 5.55 |
J. Bruna | 2 | 1697 | 82.95 |
Bugra Can | 3 | 2 | 1.37 |
Mert Gürbüzbalaban | 4 | 55 | 12.36 |
Stefanie Jegelka | 5 | 792 | 46.31 |
Lin Hongzhou | 6 | 0 | 0.34 |