Title
vqSGD: Vector Quantized Stochastic Gradient Descent
Abstract
In this work, we present a family of vector quantization schemes, vqSGD (Vector-Quantized Stochastic Gradient Descent), that provide an asymptotic reduction in communication cost with convergence guarantees in first-order distributed optimization. In the process we derive the following fundamental information-theoretic fact: $\Theta(d/R^2)$ bits are necessary and sufficient (up to an additive $O(\log d)$ term) to describe an unbiased estimator $\hat{g}(g)$ for any $g$ in the $d$-dimensional unit sphere, under the constraint that $\|\hat{g}(g)\|_2 \le R$ almost surely. In particular, we consider a randomized scheme based on the convex hull of a point set that returns an unbiased estimator of a $d$-dimensional gradient vector with almost surely bounded norm. We provide multiple efficient instances of our scheme that are near optimal and require only $o(d)$ bits of communication, at the expense of a tolerable increase in error. The instances of our quantization scheme are obtained using the properties of binary error-correcting codes and provide a smooth tradeoff between the communication cost and the estimation error of quantization. Furthermore, we show that vqSGD also offers automatic privacy guarantees.
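To make the convex-hull construction concrete, below is a minimal sketch of its simplest instance, a cross-polytope scheme: a gradient $g$ with $\|g\|_2 \le 1$ is written as a convex combination of the $2d$ vertices $\{\pm\sqrt{d}\,e_i\}$, and a single vertex index (about $\log_2(2d)$ bits) is sampled and transmitted. This is an illustrative NumPy sketch under those assumptions, not the authors' reference code; the function names are hypothetical, and a gradient with larger norm would be rescaled with its norm sent separately.

```python
import numpy as np

def cross_polytope_encode(g, rng=None):
    """Sample one vertex of the scaled cross-polytope {±sqrt(d)·e_i} so that
    the decoded vertex is an unbiased estimator of g (assumes ||g||_2 <= 1).
    Only the returned index is communicated: ceil(log2(2d)) bits."""
    rng = np.random.default_rng() if rng is None else rng
    d = g.shape[0]
    w = np.abs(g) / np.sqrt(d)           # weight on the vertex sign(g_i)*sqrt(d)*e_i
    leftover = max(0.0, 1.0 - w.sum())   # slack: ||g||_1 <= sqrt(d)·||g||_2 <= sqrt(d)
    # Probabilities over the 2d vertices (first d: +sqrt(d)e_i, last d: -sqrt(d)e_i).
    # Leftover mass is split evenly over ± pairs, whose contributions cancel in expectation.
    probs = np.concatenate([np.where(g >= 0, w, 0.0),
                            np.where(g < 0, w, 0.0)]) + leftover / (2 * d)
    return rng.choice(2 * d, p=probs / probs.sum())

def cross_polytope_decode(idx, d):
    """Reconstruct the transmitted vertex; E[decode(encode(g))] = g, and the
    norm is sqrt(d) almost surely (so R = sqrt(d) in the abstract's notation)."""
    v = np.zeros(d)
    v[idx % d] = np.sqrt(d) if idx < d else -np.sqrt(d)
    return v

# Averaging decoded estimates from many workers drives down the variance:
g = np.array([0.6, -0.3, 0.1, 0.2])
g /= max(1.0, np.linalg.norm(g))
est = np.mean([cross_polytope_decode(cross_polytope_encode(g), len(g))
               for _ in range(20000)], axis=0)   # est ≈ g
```

This instance uses only $O(\log d)$ bits per gradient at the cost of variance that grows with $d$; the code-based instances mentioned in the abstract trade more bits for lower error along the same lines.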
Year
2021
Venue
24TH INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS (AISTATS)
DocType
Conference
Volume
130
ISSN
2640-3498
Citations
0
PageRank
0.34
References
0
Authors
4
Name               Order   Citations   PageRank
Venkata Gandikota  1       5           4.46
Daniel M. Kane     2       743         61.43
Raj Kumar Maity    3       0           1.69
Arya Mazumdar      4       307         41.81