Title
Understanding adaptive gradient clipping in DP-SGD, empirically
Abstract
Differentially Private Stochastic Gradient Descent (DP-SGD) is a leading method for training machine learning models with rigorous privacy guarantees. Since its introduction, DP-SGD has been widely adopted in both academic and industrial research. A well-known challenge in using DP-SGD is how to improve utility while maintaining privacy. To this end, several recent proposals clip the gradients with adaptive thresholds rather than a fixed one. Although each proposal comes with some theoretical justification, the theories often rely on strong assumptions and are incompatible with one another, so it is hard to know whether, and how well, these methods work in practice. In this paper, we investigate adaptive clipping in DP-SGD from an empirical perspective. Through extensive experiments, we gain fresh insights and propose two new adaptive clipping strategies based on them. We experimentally cross-compare the existing methods and our new strategies. The results show that our strategies provide a substantial improvement in model accuracy and consistently outperform state-of-the-art adaptive clipping methods.
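To make the setting concrete, the following is a minimal sketch of one DP-SGD step with an adaptive clipping threshold, where the threshold is set to a quantile of the per-sample gradient norms. This is a common adaptive-clipping heuristic used only for illustration; it is not the paper's proposed strategies, and the function name and parameters are assumptions.

```python
import numpy as np

def clip_and_noise(per_sample_grads, clip_quantile=0.5,
                   noise_multiplier=1.0, rng=None):
    """One DP-SGD aggregation step with an adaptive clipping threshold.

    Illustrative only: the threshold C is set to a quantile of the
    per-sample gradient norms (not the paper's own strategies).
    """
    rng = np.random.default_rng() if rng is None else rng
    # Per-sample L2 norms of the gradients (one row per example).
    norms = np.linalg.norm(per_sample_grads, axis=1)
    # Adaptive clipping threshold: a quantile of the observed norms.
    C = np.quantile(norms, clip_quantile)
    # Scale each sample's gradient so its norm is at most C.
    scale = np.minimum(1.0, C / np.maximum(norms, 1e-12))
    clipped = per_sample_grads * scale[:, None]
    # Sum the clipped gradients and add Gaussian noise calibrated to C.
    noisy_sum = clipped.sum(axis=0) + rng.normal(
        0.0, noise_multiplier * C, size=per_sample_grads.shape[1])
    # Average over the batch to obtain the update direction.
    return noisy_sum / len(per_sample_grads)
```

Note that in a real DP-SGD implementation the threshold itself must also be chosen privately (e.g., via a noisy quantile estimate), since the gradient norms depend on the data; this sketch omits that step.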
Year: 2022
DOI: 10.1002/int.23001
Venue: International Journal of Intelligent Systems
Keywords: adaptive gradient clipping, differentially private stochastic gradient descent, machine learning, privacy protection
DocType: Journal
Volume: 37
Issue: 11
ISSN: 0884-8173
Citations: 0
PageRank: 0.34
References: 0
Authors: 7
Name            Order  Citations  PageRank
Guanbiao Lin    1      0          0.68
Hongyang Yan    2      0          0.34
Guang Kou       3      0          0.34
Teng Huang      4      0          0.34
Shiyu Peng      5      0          0.68
Yingying Zhang  6      0          0.34
Changyu Dong    7      0          0.34