Title
Layer-Based Communication-Efficient Federated Learning with Privacy Preservation
Abstract
In recent years, federated learning has attracted increasing attention because it can collaboratively train a global model without gathering users' raw data, but it also brings challenges such as high communication cost and privacy leakage. In this paper, we propose a layer-based federated learning system with privacy preservation. We reduce the communication cost by selecting only several layers of the model to upload for global averaging, and enhance privacy protection by applying local differential privacy. We evaluated our system on three datasets under non-independently and identically distributed (non-IID) scenarios. Compared with existing works, our solution achieves better performance in both model accuracy and training time.
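The two ideas named in the abstract, uploading only a selected subset of layers and perturbing them with local differential privacy noise before server-side averaging, can be sketched as follows. This is a hedged illustration, not the authors' implementation: the layer-selection rule, the Laplace mechanism, the `epsilon` value, and the sensitivity are all assumptions made for the example.

```python
import random

def laplace_noise(scale):
    # Laplace(0, scale) sampled as the difference of two exponentials.
    return scale * (random.expovariate(1.0) - random.expovariate(1.0))

def perturb(layer, epsilon, sensitivity=1.0):
    # epsilon-local-DP perturbation of one layer's weights (assumed mechanism).
    scale = sensitivity / epsilon
    return [w + laplace_noise(scale) for w in layer]

def client_update(model, selected_layers, epsilon):
    # Each client uploads only the selected layers, noised locally.
    return {name: perturb(model[name], epsilon) for name in selected_layers}

def server_average(uploads):
    # The server averages each uploaded layer element-wise across clients.
    return {
        name: [sum(vals) / len(uploads)
               for vals in zip(*(u[name] for u in uploads))]
        for name in uploads[0]
    }
```

Because only the selected layers are transmitted and averaged, the per-round upload size shrinks in proportion to the fraction of layers chosen, while the local noise keeps individual updates private.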
Year: 2022
DOI: 10.1587/transinf.2021BCP0006
Venue: IEICE TRANSACTIONS ON INFORMATION AND SYSTEMS
Keywords: federated learning, privacy preservation, parameter selection, communication-efficient, non-IID data
DocType: Journal
Volume: E105D
Issue: 2
ISSN: 1745-1361
Citations: 0
PageRank: 0.34
References: 0
Authors: 4
Name            Order  Citations  PageRank
Zhuotao Lian    1      2          1.38
Weizheng Wang   2      0          1.35
Huakun Huang    3      0          0.68
Chunhua Su      4      0          1.01