Title
Collaborative deep learning across multiple data centers
Abstract
Valuable training data is often owned by independent organizations and stored in multiple data centers. Most deep learning approaches require centralizing this multi-datacenter data for performance reasons. In practice, however, transferring the data of different organizations to a centralized data center is often infeasible owing to privacy regulations, so conducting geo-distributed deep learning across data centers without privacy leaks is very challenging. Model averaging is a conventional choice for data-parallel training and can reduce the risk of privacy leaks, but previous studies claim it is ineffective because deep neural networks are often non-convex. In this paper, we argue that model averaging can be effective in the decentralized environment by using two strategies: a cyclical learning rate (CLR) and an increased number of epochs for local model training. With these two strategies, we show that model averaging in the decentralized setting can provide performance competitive with data-centralized training. We conduct extensive experiments in a practical environment with multiple data centers, using state-of-the-art deep network architectures on different types of data. The results demonstrate the effectiveness and robustness of the proposed method.
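The core idea described in the abstract — each data center trains a local replica for several epochs under a cyclical learning rate, after which the parameters are averaged into a new global model — can be sketched as below. This is a minimal illustration assuming a PyTorch setup; the function names, hyperparameters (base_lr, max_lr, local epoch and round counts), and the triangular CLR schedule are assumptions for illustration, not the authors' released implementation.

# Sketch of model averaging across data centers with a cyclical learning
# rate (CLR) and several local epochs per communication round.
import copy
import torch
import torch.nn as nn

def local_train(model, loader, local_epochs=5, base_lr=1e-3, max_lr=0.1):
    """Train a local replica for several epochs under a cyclical learning rate."""
    opt = torch.optim.SGD(model.parameters(), lr=base_lr, momentum=0.9)
    sched = torch.optim.lr_scheduler.CyclicLR(
        opt, base_lr=base_lr, max_lr=max_lr,
        step_size_up=max(1, len(loader) * local_epochs // 2))  # one LR triangle per round
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(local_epochs):
        for x, y in loader:
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()
            sched.step()
    return model.state_dict()

def average_models(states):
    """Element-wise average of the parameter tensors from all data centers."""
    avg = copy.deepcopy(states[0])
    for k in avg:
        avg[k] = torch.stack([s[k].float() for s in states]).mean(dim=0)
    return avg

def collaborative_training(global_model, center_loaders, rounds=20):
    """Alternate local CLR training and model averaging; raw data never moves."""
    for _ in range(rounds):
        states = []
        for loader in center_loaders:  # each loader stays inside its own data center
            local = copy.deepcopy(global_model)
            states.append(local_train(local, loader))
        global_model.load_state_dict(average_models(states))
    return global_model

Only model parameters cross data-center boundaries in this scheme; increasing local_epochs reduces communication rounds, which is where the paper argues the CLR helps local training escape poor regions despite non-convexity.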
Year
2020
DOI
10.1007/s11432-019-2705-2
Venue
SCIENCE CHINA-INFORMATION SCIENCES
Keywords
collaborative learning, multiple datacenters, distributed machine learning
DocType
Journal
Volume
63
Issue
SP8
ISSN
1674-733X
Citations
3
PageRank
0.39
References
25
Authors
8
Name            Order  Citations  PageRank
Haibo Mi        1      134        12.60
Kele Xu         2      46         21.80
Dawei Feng      3      14         5.09
Wang Huaimin    4      1025       121.31
Yiming Zhang    5      143        37.82
Zibin Zheng     6      3731       199.37
Chuan Chen      7      54         9.82
Xu Lan          8      13         2.55