Title
Federated Learning Method Based on Knowledge Distillation and Deep Gradient Compression
Abstract
Federated learning is a new paradigm for multi-agency collaborative model training and is widely used in many fields; communication overhead is one of its key issues. To reduce the amount of data transmitted during communication, we propose a federated learning algorithm based on knowledge distillation and deep gradient compression (Fed-KDDGC-SGD). First, we use local dat...
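The abstract names deep gradient compression as one of the two building blocks. Its core idea is top-k gradient sparsification: each client transmits only the largest-magnitude gradient entries and accumulates the rest locally as a residual for later rounds. The sketch below illustrates that idea only; it is a minimal illustration assuming NumPy arrays, not the paper's full Fed-KDDGC-SGD algorithm (which also involves knowledge distillation and further DGC refinements such as momentum correction). The function name `topk_sparsify` and the 1% ratio are illustrative choices.

```python
import numpy as np

def topk_sparsify(grad, ratio=0.01):
    """Keep only the largest-magnitude entries of a gradient tensor.

    Returns (sparse, residual): `sparse` holds the top-k entries and is
    what would be transmitted; `residual` holds everything else and is
    kept locally, to be added to the next round's gradient.
    """
    flat = grad.ravel()
    k = max(1, int(flat.size * ratio))
    # Indices of the k largest-magnitude entries.
    idx = np.argpartition(np.abs(flat), -k)[-k:]
    sparse = np.zeros_like(flat)
    sparse[idx] = flat[idx]
    residual = flat - sparse  # accumulated locally, not transmitted
    return sparse.reshape(grad.shape), residual.reshape(grad.shape)

# Example: compress a random gradient to 1% density.
rng = np.random.default_rng(0)
g = rng.normal(size=(100, 100))
sparse_g, residual = topk_sparsify(g, ratio=0.01)
print(np.count_nonzero(sparse_g))  # 100 of 10000 entries transmitted
```

Because `sparse + residual` reconstructs the original gradient exactly, no information is discarded; transmission of the small entries is only deferred, which is what lets this style of compression preserve accuracy while cutting per-round communication.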
Year
2021
DOI
10.1109/CCIS53392.2021.9754651
Venue
2021 IEEE 7th International Conference on Cloud Computing and Intelligent Systems (CCIS)
Keywords
Training, Solid modeling, Conferences, Computational modeling, Collaboration, Collaborative work, Data models
DocType
Conference
ISBN
978-1-6654-4149-0
Citations
0
PageRank
0.34
References
0
Authors
5
Name | Order | Citations | PageRank
Haiyan Cui | 1 | 0 | 0.68
Junping Du | 2 | 7899 | 1.80
Yang Jiang | 3 | 0 | 0.34
Yue Wang | 4 | 4863 | 8.99
Runyu Yu | 5 | 0 | 0.34