Title
Adaptive Batchsize Selection And Gradient Compression For Wireless Federated Learning
Abstract
In wireless federated learning systems, wireless communication and local computation significantly affect the learning latency due to the limited bandwidth and computing power of mobile devices. To reduce this latency, local stochastic gradient methods and gradient compression can be applied, which, however, decrease the convergence rate. To address this trade-off between convergence rate and learning latency, this paper first formulates an optimization problem that maximizes the convergence rate under a given training latency constraint by jointly optimizing the batch size, compression ratio, and spectrum allocation. By decomposing the problem into two subproblems, an adaptive algorithm is then proposed to obtain the optimal solution. The results show that the batch size and compression ratio should be selected according to each device's computing power and channel state information to improve the convergence rate. Finally, experimental results verify the effectiveness of the proposed algorithm.
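The abstract does not specify the gradient compressor used. A common instance of gradient compression with a tunable compression ratio is top-k sparsification; the sketch below is an illustrative assumption, not the paper's actual scheme (the function name `topk_compress` and the example values are hypothetical).

```python
import numpy as np

def topk_compress(grad, ratio):
    """Keep only the largest-magnitude `ratio` fraction of gradient entries.

    Illustrative top-k sparsification; the paper's compressor may differ.
    """
    flat = grad.ravel()
    k = max(1, int(ratio * flat.size))          # number of entries to keep
    idx = np.argpartition(np.abs(flat), -k)[-k:]  # indices of top-k magnitudes
    sparse = np.zeros_like(flat)
    sparse[idx] = flat[idx]                     # zero out everything else
    return sparse.reshape(grad.shape)

# Example: compress a toy gradient to 25% of its entries.
g = np.array([0.1, -2.0, 0.3, 4.0, -0.05, 1.5, 0.2, -3.0])
c = topk_compress(g, 0.25)
print(np.count_nonzero(c))  # 2 entries survive
```

Only the index/value pairs of the surviving entries would need to be transmitted, which is how a smaller compression ratio reduces uplink latency at the cost of a noisier aggregate gradient.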
Year: 2020
DOI: 10.1109/GLOBECOM42002.2020.9322122
Venue: 2020 IEEE GLOBAL COMMUNICATIONS CONFERENCE (GLOBECOM)
DocType: Conference
ISSN: 2334-0983
Citations: 0
PageRank: 0.34
References: 0
Authors: 5
Name           Order  Citations  PageRank
Shengli Liu    1      3          1.74
Guanding Yu    2      1287       101.15
Rui Yin        3      129        11.38
Jiantao Yuan   4      0          0.68
Fengzhong Qu   5      196        19.02