Title
Optimizing Federated Edge Learning on Non-IID Data via Neural Architecture Search
Abstract
To exploit the vast amount of data distributed across edge devices, Federated Learning (FL) has been proposed to learn a shared model by performing training locally on participating devices and aggregating the local models into a global one. Existing FL algorithms suffer accuracy loss because data samples across devices are usually not independent and identically distributed (non-i.i.d.). Moreover, devices may lose connection during training in wireless edge computing. We therefore advocate the one-shot Neural Architecture Search (NAS) technique as the basis of a solution that mitigates the non-i.i.d. problem and is robust to intermittent connections. We adopt a large network as the global model, which includes all candidate network architectures. The non-i.i.d. problem is alleviated in two steps: (1) identify and train, via a heuristic sampling scheme, the candidate networks that are potentially high-performing and less biased; (2) search among these candidates for the final model with the highest accuracy. Experimental results show that the model trained by our proposed method is robust to the non-i.i.d. problem and reduces communication overhead by 84% compared with the baselines.
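The two-step scheme described in the abstract can be sketched as follows. This is a minimal toy illustration, not the paper's implementation: the candidate names, the scalar "weights", the heuristic scoring, and the validation function are all assumptions made for the sketch. It shows the overall control flow only: sample promising sub-architectures from a supernet, aggregate client updates FedAvg-style, then select the highest-scoring candidate.

```python
import random

# Hypothetical illustration of the abstract's two-step scheme (all names
# and scoring functions below are assumptions, not from the paper).
random.seed(0)

# The supernet's candidate sub-architectures (toy stand-ins).
CANDIDATES = ["subnet_a", "subnet_b", "subnet_c", "subnet_d"]

def heuristic_sample(candidates, scores, k):
    """Pick the k candidates with the best running scores
    (a stand-in for the paper's heuristic sampling scheme)."""
    return sorted(candidates, key=lambda c: -scores[c])[:k]

def federated_round(client_weights, client_sizes):
    """FedAvg-style aggregation: average of client weights,
    weighted by local dataset size."""
    total = sum(client_sizes)
    return sum(w * n for w, n in zip(client_weights, client_sizes)) / total

# Toy global state: one scalar "weight" and a running score per candidate.
weights = {c: 0.0 for c in CANDIDATES}
scores = {c: random.random() for c in CANDIDATES}

# Step 1: train the sampled candidates over several communication rounds.
for rnd in range(3):
    for cand in heuristic_sample(CANDIDATES, scores, k=2):
        # Each of 3 clients produces a simulated local update.
        client_weights = [weights[cand] + random.gauss(0.1, 0.01)
                          for _ in range(3)]
        client_sizes = [100, 150, 250]  # local dataset sizes
        weights[cand] = federated_round(client_weights, client_sizes)

def validate(cand):
    """Toy validation accuracy: trained candidates score higher."""
    return scores[cand] + weights[cand]

# Step 2: search for the final model with the highest accuracy.
best = max(CANDIDATES, key=validate)
print(best)
```

In a real setting the scalar weight would be a full parameter tensor per sub-architecture, and the validation step would evaluate each sampled sub-network on held-out data; the sketch only preserves the sample-train-aggregate-search structure.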
Year
2021
DOI
10.1109/GLOBECOM46510.2021.9685909
Venue
2021 IEEE GLOBAL COMMUNICATIONS CONFERENCE (GLOBECOM)
DocType
Conference
ISSN
2334-0983
Citations
0
PageRank
0.34
References
0
Authors
6
Name          Order  Citations  PageRank
Feifei Zhang  1      0          0.34
JiDong Ge     2      119        28.39
Chifong Wong  3      6          1.46
Sheng Zhang   4      44         15.62
Chuanyi Li    5      0          0.34
Bin Luo       6      66         21.04