Title
Differentially Private Gossip Gradient Descent
Abstract
In this paper, we study the problem of learning a linear regression model in a distributed fashion over a network of N interconnected agents, in which each agent can deploy an online learning algorithm to adaptively learn the regression model using its private data. The goal is to devise a distributed algorithm, under the constraint that each agent can communicate only with its neighbors as specified by a connected communication graph, which enables all N agents to converge to the true model with performance comparable to that of conventional centralized algorithms. We propose a differentially private distributed algorithm, called private gossip gradient descent, and establish ε-differential privacy and an O(√(log²t / (ε(1−λ₂)Nt))) convergence rate, where λ₂ is the second largest eigenvalue of the expected gossip matrix corresponding to the communication graph.
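The abstract describes the scheme only at a high level. The Python sketch below illustrates one plausible round of gossip-based, differentially private gradient descent for linear regression; the random pairing rule, the diminishing step size, and the Laplace noise mechanism are assumptions made for illustration and are not taken from the paper.

import numpy as np

def private_gossip_gd_round(theta, X, y, t, neighbors, eps=1.0, rng=None):
    """One illustrative round of private gossip gradient descent.

    theta     : (N, d) array, current model estimate of each of the N agents
    X, y      : length-N lists holding each agent's private regression data
    t         : current iteration index (used for the diminishing step size)
    neighbors : dict mapping agent index -> list of neighbor indices in the graph
    eps       : privacy parameter epsilon (assumed Laplace mechanism)
    """
    rng = np.random.default_rng() if rng is None else rng
    N, d = theta.shape
    new_theta = theta.copy()

    # Gossip step: a uniformly random agent wakes up and averages its estimate
    # with a uniformly random neighbor (a common gossip model; the paper's
    # exact protocol may differ).
    i = rng.integers(N)
    j = rng.choice(neighbors[i])
    avg = 0.5 * (theta[i] + theta[j])

    # Local noisy gradient step at the two active agents: each computes the
    # least-squares gradient on its private data and perturbs it with Laplace
    # noise before updating, so the shared iterate is differentially private.
    step = 1.0 / (t + 1)
    for k in (i, j):
        grad = X[k].T @ (X[k] @ avg - y[k]) / len(y[k])
        noise = rng.laplace(scale=1.0 / eps, size=d)
        new_theta[k] = avg - step * (grad + noise)

    return new_theta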
Year: 2018
DOI: 10.1109/CDC.2018.8619437
Venue: 2018 IEEE Conference on Decision and Control (CDC)
Field: Convergence (routing), Discrete mathematics, Mathematical optimization, Gradient descent, Computer science, Matrix (mathematics), Gossip, Distributed algorithm, Eigenvalues and eigenvectors, Linear regression, Lambda
DocType: Conference
ISSN: 0743-1546
Citations: 0
PageRank: 0.34
References: 0
Authors: 3
Name          Order  Citations  PageRank
Yang Liu      1      32         11.55
Ji Liu        2      146        26.61
Tamer Basar   3      3497       402.11