Title: DANCE: Distributed Generative Adversarial Networks with Communication Compression
Abstract: Generative adversarial networks (GANs) have shown great success in deep representation learning, data generation, and security enhancement. With the development of the Internet of Things, 5th-generation wireless systems (5G), and other technologies, the large volume of data collected at the network edge provides a new way to improve the capabilities of GANs. Due to privacy, bandwidth, and legal constraints, it is not appropriate to upload all the data to the cloud or to servers for processing. Therefore, this article focuses on deploying and training GANs at the edge rather than aggregating edge data at a central node. To address this problem, we design a novel distributed learning architecture for GANs, called DANCE. DANCE adaptively performs communication compression based on the available bandwidth while supporting both data- and model-parallel training of GANs. In addition, inspired by the gossip mechanism and the Stackelberg game, a compatible algorithm, AC-GAN, is proposed. Theoretical analysis guarantees the convergence of the model and the existence of an approximate equilibrium in AC-GAN. Both simulation and prototype-system experiments show that AC-GAN achieves better training effectiveness with less communication overhead than state-of-the-art algorithms, i.e., FL-GAN and MD-GAN.
Year: 2022
DOI: 10.1145/3458929
Venue: ACM Transactions on Internet Technology
Keywords: Generative adversarial networks, distributed learning, communication compression
DocType: Journal
Volume: 22
Issue: 2
ISSN: 1533-5399
Citations: 0
PageRank: 0.34
References: 12
Authors: 5
Name             | Order | Citations | PageRank
Xiongtao Zhang   | 1     | 3         | 2.10
Xiaomin Zhu      | 2     | 921       | 100.31
Ji Wang          | 3     | 190       | 36.75
Weidong Bao      | 4     | 23        | 6.49
Laurence T. Yang | 5     | 6870      | 682.61