Title |
---|
Distributed Linearized Alternating Direction Method of Multipliers for Composite Convex Consensus Optimization |
Abstract |
---|
Given an undirected graph G = (𝒩, ℰ) of agents 𝒩 = {1, ..., N} connected with edges in ℰ, we study how to compute an optimal decision on which there is consensus among agents and which minimizes the sum of agent-specific private convex composite functions {Φ_i}_{i∈𝒩}, where Φ_i ≜ ξ_i + f_i belongs to agent-i. Assuming only agents connected by an edge can communicate, we propose a distributed proximal gradient algorithm (DPGA) for consensus optimization over both unweighted and weighted static (undirected) communication networks. In one iteration, each agent-i computes the prox map of ξ_i and the gradient of f_i, followed by local communication with neighboring agents. We also study its stochastic gradient variant, SDPGA, which can access only noisy estimates of ∇f_i at each agent-i. This computational model abstracts a number of applications in distributed sensing, machine learning, and statistical inference. We show ergodic convergence in both suboptimality error and consensus violation for DPGA and SDPGA with rates O(1/t) and O(1/√t), respectively. |
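The iteration sketched in the abstract — each agent takes a gradient step on its smooth term f_i, applies the prox map of its nonsmooth term ξ_i, and averages with neighbors — can be illustrated with a generic decentralized proximal-gradient scheme. This is a minimal toy sketch of that pattern, not the paper's exact DPGA update: the quadratic f_i, the ℓ1 term ξ_i, the Metropolis weight matrix W, and all function names here are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, thresh):
    """Prox map of thresh * |x| (soft-thresholding), playing the role of prox of xi_i."""
    return np.sign(v) * np.maximum(np.abs(v) - thresh, 0.0)

def decentralized_prox_grad(b, W, lam=0.01, step=0.1, iters=300):
    """Toy decentralized proximal-gradient loop (illustrative, not the paper's DPGA).

    Agent i privately holds f_i(x) = 0.5 * (x - b[i])**2 and xi_i(x) = lam * |x|.
    W is a symmetric doubly stochastic mixing matrix supported on the graph edges,
    so the matrix-vector product W @ (...) is exactly one round of neighbor-only
    communication.
    """
    x = np.zeros_like(b)
    for _ in range(iters):
        grad = x - b                        # local gradient of f_i at agent i
        x = W @ (x - step * grad)           # gradient step, then mix with neighbors
        x = soft_threshold(x, step * lam)   # local prox step for xi_i
    return x

# 3 agents on a path graph 1-2-3 with Metropolis weights (degrees 1, 2, 1)
W = np.array([[2/3, 1/3, 0.0],
              [1/3, 1/3, 1/3],
              [0.0, 1/3, 2/3]])
x = decentralized_prox_grad(np.array([1.0, 2.0, 3.0]), W)
```

With a constant step size this scheme only reaches approximate consensus near the minimizer of the sum (here, near the average of the b[i]); the residual disagreement shrinks with the step size, which is one reason schemes with exact convergence guarantees, like the algorithms analyzed in the paper, are of interest.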
Year | DOI | Venue |
---|---|---|
2018 | 10.1109/TAC.2017.2713046 | IEEE Trans. Automat. Contr. |
Keywords | Field | DocType
---|---|---
Convex functions, Distributed algorithms, Convergence, Optimization, Distributed databases, Memory management, Network topology | Convergence (routing), Discrete mathematics, Graph, Mathematical optimization, Combinatorics, Nabla symbol, Ergodic theory, Proximal Gradient Methods, Regular polygon, Mathematics | Journal
Volume | Issue | ISSN
---|---|---
63 | 1 | 0018-9286
Citations | PageRank | References
---|---|---
23 | 0.75 | 29
Authors |
---|
4 |
Name | Order | Citations | PageRank |
---|---|---|---|
N. S. Aybat | 1 | 89 | 10.49 |
James Z. Wang | 2 | 7526 | 403.00 |
Tianyi Lin | 3 | 23 | 0.75 |
Shiqian Ma | 4 | 1068 | 63.48 |