Title
Towards Federated Learning against Noisy Labels via Local Self-Regularization
Abstract
Federated learning (FL) aims to learn joint knowledge from a large number of decentralized devices with labeled data in a privacy-preserving manner. However, data with noisy labels are ubiquitous in reality, since high-quality labeled data require expensive human effort, and such noise causes severe performance degradation. Although many methods have been proposed to deal with noisy labels directly, they either incur excessive computation overhead or violate the privacy-protection principle of FL. To this end, we focus on this issue in FL, with the purpose of alleviating the performance degradation caused by noisy labels while guaranteeing data privacy. Specifically, we propose a Local Self-Regularization method, which effectively regularizes the local training process by implicitly hindering the model from memorizing noisy labels and by explicitly narrowing the model's output discrepancy between original and augmented instances using self-distillation. Experimental results demonstrate that our proposed method achieves notable resistance to noisy labels at various noise levels on three benchmark datasets. In addition, we integrate our method with existing state-of-the-art approaches and achieve superior performance on the real-world dataset Clothing1M. The code is available at https://github.com/Sprinter1999.
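The abstract's "explicit" regularization term narrows the output discrepancy between an original sample and its augmented view via self-distillation. A minimal sketch of one plausible form of that term, a symmetric KL divergence between the two predictive distributions, is shown below; the function name and the exact discrepancy measure are assumptions for illustration, not the authors' released implementation.

```python
# Hypothetical sketch of the self-distillation discrepancy term described
# in the abstract: penalize disagreement between the model's predictions
# on an original instance and on its augmented counterpart.
import numpy as np

def softmax(z):
    # Numerically stable softmax over the last axis.
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kl(p, q, eps=1e-12):
    # KL(p || q) per row, with eps to avoid log(0).
    return np.sum(p * (np.log(p + eps) - np.log(q + eps)), axis=-1)

def self_distillation_loss(logits_orig, logits_aug):
    """Symmetric KL between predictions on original and augmented views.

    This exact form is an assumption; the paper may use a different
    discrepancy measure.
    """
    p = softmax(logits_orig)
    q = softmax(logits_aug)
    return float(np.mean(0.5 * (kl(p, q) + kl(q, p))))

# Identical predictions yield zero loss; diverging ones a positive loss.
a = np.array([[2.0, 0.5, -1.0]])
print(self_distillation_loss(a, a))                           # 0.0
print(self_distillation_loss(a, a + [[0.0, 1.0, 0.0]]) > 0)   # True
```

In a local training loop this term would be added to the usual classification loss, pulling the two views' outputs together regardless of the (possibly noisy) label.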
Year
2022
DOI
10.1145/3511808.3557475
Venue
Conference on Information and Knowledge Management
DocType
Conference
Citations
0
PageRank
0.34
References
0
Authors
4
Name          | Order | Citations | PageRank
Xuefeng Jiang | 1     | 0         | 0.34
Sheng Sun     | 2     | 0         | 0.34
Yuwei Wang    | 3     | 0         | 0.34
Min Liu       | 4     | 335       | 40.49