Title
Improving the Privacy and Accuracy of ADMM-Based Distributed Algorithms.
Abstract
The alternating direction method of multipliers (ADMM) is a popular method for designing distributed versions of machine learning algorithms, whereby local computations are performed on local data and the outputs are exchanged among neighbors in an iterative fashion. This iterative exchange can leak private information about the local data. A differentially private ADMM was proposed in prior work (Zhang & Zhu, 2017), but it only bounds the privacy loss of a single node during a single iteration; this makes it difficult to balance the utility attained through distributed computation against the privacy guarantee when the total privacy loss of all nodes over the entire iterative process is accounted for. We propose a perturbation method for ADMM in which the perturbation is correlated with the penalty parameters; this is shown to improve utility and privacy simultaneously. The method is based on a modified ADMM in which each node independently determines its own penalty parameter in every iteration and decouples it from the dual updating step size. We also derive the condition for convergence of the modified ADMM and a lower bound on its convergence rate.
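The abstract's main ideas can be illustrated with a minimal toy sketch (not the authors' algorithm): decentralized consensus ADMM on a least-squares problem, where each node adds Laplace noise whose scale shrinks as its penalty parameter grows, and the dual step size is a fixed constant decoupled from the penalty. The network, the problem, the geometric penalty schedule, and all parameter values below are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy decentralized least-squares: node i holds private data (A_i, b_i),
# keeps a local primal variable x_i, and exchanges it with neighbors each
# iteration (here a fully connected network of 3 nodes, ground truth = ones).
d, nodes = 5, 3
A = [rng.normal(size=(20, d)) for _ in range(nodes)]
b = [A[i] @ np.ones(d) + 0.1 * rng.normal(size=20) for i in range(nodes)]

x = [np.zeros(d) for _ in range(nodes)]    # primal variables
lam = [np.zeros(d) for _ in range(nodes)]  # dual variables
eta = 1.0                                  # fixed dual step size, decoupled from rho
eps_per_iter = 1.0                         # illustrative per-iteration privacy budget

for t in range(50):
    x_bar = np.mean(x, axis=0)             # average of neighbors' variables
    for i in range(nodes):
        # Each node picks its own penalty parameter per iteration (assumed
        # geometric schedule here; the paper leaves the choice to each node).
        rho = 1.1 ** t
        # Exact minimizer of the local augmented Lagrangian for least squares:
        # argmin_x 0.5*||A_i x - b_i||^2 + lam_i.(x - x_bar) + rho/2*||x - x_bar||^2
        H = A[i].T @ A[i] + rho * np.eye(d)
        g = A[i].T @ b[i] - lam[i] + rho * x_bar
        x_new = np.linalg.solve(H, g)
        # Penalty-correlated perturbation: noise scale decays as rho grows,
        # so later iterates leak less while the iteration stays accurate.
        x[i] = x_new + rng.laplace(scale=1.0 / (eps_per_iter * rho), size=d)
        lam[i] = lam[i] + eta * (x[i] - x_bar)  # dual update with fixed step size

print(np.round(np.mean(x, axis=0), 2))  # consensus estimate of the ground truth
```

The sketch only conveys the structural idea from the abstract: growing each node's penalty parameter lets the injected noise shrink over iterations, while the fixed dual step size keeps the dual update independent of that choice.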
Year: 2018
Venue: ICML
DocType:
Journal:
Volume: abs/1806.02246
Citations: 4
PageRank: 0.42
References: 12
Authors: 3
Name                    Order  Citations  PageRank
Xueru Zhang             1      10         5.31
Mohammad Mahdi Khalili  2      21         5.19
Mingyan Liu             3      25692      24.92