Title
Toward Communication-Efficient Federated Learning in the Internet of Things With Edge Computing
Abstract
Federated learning is an emerging paradigm that trains machine learning models on local, distributed data sets without sending the raw data to a central data center. However, in the Internet of Things (IoT), where wireless network resources are constrained, a key problem of federated learning is the communication overhead of parameter synchronization, which wastes bandwidth, increases training time, and can even degrade model accuracy. Gradient sparsification, which updates only the significant gradients and accumulates the insignificant gradients locally, has received increasing attention. However, how to preserve model accuracy under high sparsification ratios has been ignored in the literature. In this article, a general gradient sparsification (GGS) framework is proposed for adaptive optimizers to correct the sparse gradient update process. It consists of two important mechanisms: 1) gradient correction and 2) batch normalization (BN) updates with local gradients. With gradient correction, the optimizer can properly handle the accumulated insignificant gradients, which helps the model converge better. Furthermore, updating the BN layers with local gradients mitigates the impact of delayed gradients without increasing the communication overhead. We have conducted experiments on LeNet-5, CifarNet, DenseNet-121, and AlexNet with adaptive optimizers. Results show that even when 99.9% of the gradients are sparsified, the models maintain their top-1 accuracy on the validation data sets.
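Illustrative note: the sparsification step described above (transmit only the significant gradients, accumulate the rest locally) can be sketched as below. This is a minimal PyTorch-style sketch under assumed names (sparsify_with_residual, ratio); it shows only the baseline top-k sparsification with residual accumulation, not the authors' full GGS framework, which additionally applies gradient correction inside the adaptive optimizer and updates the BN layers with local gradients.

import torch

def sparsify_with_residual(grad, residual, ratio=0.999):
    # Fold the locally accumulated (insignificant) gradients back in.
    acc = grad + residual
    # At a 99.9% sparsification ratio, only 0.1% of the entries are kept.
    k = max(1, int(acc.numel() * (1.0 - ratio)))
    # Treat the k largest-magnitude entries as the significant gradients.
    _, idx = torch.topk(acc.abs().flatten(), k)
    mask = torch.zeros(acc.numel(), dtype=torch.bool, device=acc.device)
    mask[idx] = True
    mask = mask.view_as(acc)
    sparse_update = torch.where(mask, acc, torch.zeros_like(acc))  # synchronized
    new_residual = torch.where(mask, torch.zeros_like(acc), acc)   # kept locally
    return sparse_update, new_residual

In a federated round, each client would send sparse_update to the server for aggregation and carry new_residual into its next local step.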
Year
2020
DOI
10.1109/JIOT.2020.2994596
Venue
IEEE Internet of Things Journal
Keywords
Adaptive optimizer, deep learning, edge computing, federated learning, gradient sparsification
DocType
Journal
Volume
7
Issue
11
ISSN
2327-4662
Citations
6
PageRank
0.40
References
0
Authors
6
Name          Order  Citations  PageRank
Haifeng Sun   1      682        7.77
Shiqi Li      2      9          0.76
Fei Yu        3      5116       335.58
Qi Qi         4      2105       6.01
J. Wang       5      4799       5.23
Jianxin Liao  6      4578       2.08