Title: Communication-Efficient and Byzantine-Robust Distributed Learning With Error Feedback
Abstract: We develop a communication-efficient distributed learning algorithm that is robust against Byzantine worker machines. We propose and analyze a distributed gradient-descent algorithm that performs simple thresholding based on gradient norms to mitigate Byzantine failures. We show that the statistical error rate of our algorithm matches that of Yin et al. (2018), which uses more comp...
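The norm-based thresholding described in the abstract can be sketched as follows. This is a minimal illustration only, not the paper's exact procedure; the function name, the trimming rule, and the `byzantine_frac` parameter are assumptions for the sketch.

```python
import numpy as np

def norm_filtered_mean(gradients, byzantine_frac):
    """Aggregate worker gradients, discarding the largest-norm fraction.

    A hedged sketch of norm-based thresholding: gradients with the
    largest norms are treated as suspected Byzantine updates and
    dropped before averaging. The exact rule in the paper may differ.
    """
    grads = np.asarray(gradients, dtype=float)
    norms = np.linalg.norm(grads, axis=1)          # one norm per worker
    m = len(grads)
    keep = m - int(np.ceil(byzantine_frac * m))    # number of workers to keep
    kept_idx = np.argsort(norms)[:keep]            # keep the smallest-norm updates
    return grads[kept_idx].mean(axis=0)            # average the surviving gradients
```

For example, with three honest workers reporting gradients near [1, 0] and one Byzantine worker reporting [100, 100], trimming a quarter of the updates removes the outlier and the aggregate stays close to the honest mean.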
Year: 2021
DOI: 10.1109/JSAIT.2021.3105076
Venue: IEEE Journal on Selected Areas in Information Theory
Keywords: Convergence, Error analysis, Compressors, Manganese, Information theory, Resilience, Neural networks
DocType: Journal
Volume: 2
Issue: 3
Citations: 0
PageRank: 0.34
References: 0
Authors: 5
Authors (in order):
1. Avishek Ghosh
2. Raj Kumar Maity
3. Swanand Kadhe
4. Arya Mazumdar
5. Kannan Ramchandran