Title
A consensus-based decentralized training algorithm for deep neural networks with communication compression
Abstract
To address the challenge of distributed computing for large-scale data processing, this paper proposes a consensus-based decentralized training method with communication compression. First, the decentralized training method is designed over a decentralized communication topology to reduce the communication burden on the busiest agent and to prevent any agent from revealing its locally stored data. The convergence of the decentralized training algorithm is then analyzed, showing that the decentralized trained model reaches the minimal empirical risk on the whole dataset without any sharing of data samples. Furthermore, model compression combined with an error-compensation method is adopted to reduce communication costs during the decentralized training process. Finally, the simulation study shows that the proposed decentralized training with error-compensated communication compression is applicable to both IID and non-IID datasets and performs much better than local training. In addition, with an appropriate compression rate, the proposed algorithm achieves performance comparable to uncompressed decentralized training and to centralized training while substantially reducing communication costs.
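The abstract outlines consensus-based decentralized training in which each agent exchanges a compressed model with its neighbors and carries the compression error forward (error compensation). Purely as an illustration of this general technique, the sketch below combines top-k compression with error feedback and consensus averaging over a doubly stochastic mixing matrix; every function name, the update rule, and the numerical example are assumptions made for illustration, not the paper's exact algorithm.

```python
# Minimal sketch (assumed, not the paper's exact algorithm) of consensus-based
# decentralized SGD with error-compensated top-k communication compression.
import numpy as np

def top_k(vec, k):
    """Keep the k largest-magnitude entries of vec and zero out the rest."""
    out = np.zeros_like(vec)
    idx = np.argpartition(np.abs(vec), -k)[-k:]
    out[idx] = vec[idx]
    return out

def decentralized_step(params, grads, W, errors, k, lr=0.01):
    """One round: compress-and-send with error feedback, consensus averaging,
    then a local gradient step.

    params : list of per-agent parameter vectors
    grads  : list of per-agent local stochastic gradients
    W      : doubly stochastic mixing matrix of the communication topology
    errors : per-agent residuals left over from the previous compression
    """
    n = len(params)
    sent = []
    for i in range(n):
        msg = params[i] + errors[i]        # add back what was not sent before
        q = top_k(msg, k)                  # compress the outgoing message
        errors[i] = msg - q                # store the new compression residual
        sent.append(q)
    new_params = []
    for i in range(n):
        mixed = sum(W[i, j] * sent[j] for j in range(n))   # consensus average
        new_params.append(mixed - lr * grads[i])           # local SGD update
    return new_params, errors

# Tiny usage example: 3 agents on a fully connected topology, 10-dim model.
rng = np.random.default_rng(0)
W = np.full((3, 3), 0.25) + 0.25 * np.eye(3)   # rows and columns sum to 1
params = [rng.standard_normal(10) for _ in range(3)]
errors = [np.zeros(10) for _ in range(3)]
grads = [rng.standard_normal(10) for _ in range(3)]
params, errors = decentralized_step(params, grads, W, errors, k=3)
```

The error term retains the information removed by compression and re-injects it in later rounds, which is what allows aggressive compression without losing the discarded updates entirely.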
Year
2021
DOI
10.1016/j.neucom.2021.01.020
Venue
Neurocomputing
Keywords
Decentralized training, Consensus, Model compression, Neural network, Decentralized communication topology, Convergence
DocType
Journal
Volume
440
ISSN
0925-2312
Citations
0
PageRank
0.34
References
0
Authors
2
Name            Order  Citations  PageRank
Bo Liu          1      0          1.35
Zhengtao Ding   2      3          15.53