Title: Joint Consensus Matrix Design and Resource Allocation for Decentralized Learning
Abstract
In decentralized machine learning over a network of workers, each worker updates its local model as a weighted average of its own model and the models received from its neighbors. Efficient consensus weight matrix design and communication resource allocation can increase the training convergence rate and reduce the wall-clock training time. In this paper, we jointly consider these two factors and propose a novel algorithm termed Communication-Efficient Network Topology (CENT), which reduces the latency of each training iteration by removing unnecessary communication links. CENT preserves the training convergence rate while enforcing communication graph sparsity and avoiding poor communication links. A numerical study with real-world machine learning data demonstrates the efficacy of the proposed solution and its performance advantage over state-of-the-art algorithms.
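The consensus update described above can be sketched as follows. This is a minimal illustration (not the CENT algorithm itself): each worker mixes its model with its neighbors' models via a consensus weight matrix W, then takes a local gradient step. The function name, learning rate, and uniform mixing weights are assumptions for illustration only.

```python
import numpy as np

def consensus_step(models, W, grads, lr=0.1):
    """One decentralized update: each worker averages the models of its
    neighbors (weighted by the consensus matrix W), then applies a local
    gradient step. This is a generic sketch, not the CENT update."""
    # models: (n_workers, d) array of stacked local models
    # W: (n_workers, n_workers) row-stochastic mixing matrix;
    #    W[i, j] > 0 only if worker j is a neighbor of worker i (or j == i)
    mixed = W @ models            # weighted average over neighbors
    return mixed - lr * grads    # local gradient step

# Toy example: 3 fully connected workers with uniform mixing weights.
W = np.full((3, 3), 1.0 / 3.0)
models = np.array([[0.0], [3.0], [6.0]])
grads = np.zeros_like(models)   # no local gradients: pure consensus
out = consensus_step(models, W, grads)
```

With zero gradients and uniform weights, a single step drives every worker to the average model (3.0 here); sparsifying W, as CENT does, trades per-step mixing quality for lower per-iteration communication latency.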
Year: 2022
DOI: 10.23919/IFIPNetworking55013.2022.9829798
Venue: 2022 IFIP Networking Conference (IFIP Networking)
Keywords: joint consensus matrix design, decentralized learning, decentralized machine, weighted average, communication resource allocation, training convergence rate, wall-clock training time, Communication-Efficient Network Topology, training iteration, unnecessary communication links, communication graph sparsity, avoiding selecting poor communication links
DocType: Conference
ISBN: 978-1-6654-8726-9
Citations: 0
PageRank: 0.34
References: 16
Authors: 5

Name                  Order  Citations  PageRank
Jingrong Wang         1      0          0.34
Ben Liang             2      2589       204.57
Zhongwen Zhu          3      0          0.34
Emmanuel Thepie Fapi  4      0          0.34
Hardik Dalal          5      0          0.34