Title
GNSD: a Gradient-Tracking Based Nonconvex Stochastic Algorithm for Decentralized Optimization
Abstract
In the era of big data, training a machine learning model over a large-scale dataset is challenging on a single machine, and even on a distributed system with a central controller. In this paper, we propose a gradient-tracking based nonconvex stochastic decentralized (GNSD) algorithm for solving nonconvex optimization problems, where the data is partitioned into multiple parts and processed by local computational resources. By exchanging parameters between neighboring nodes over a network, GNSD is able to find first-order stationary points (FOSPs) efficiently. Our theoretical analysis guarantees that, with a shrinking step-size, the convergence rate of GNSD to FOSPs matches the well-known O(1/√T) rate of stochastic gradient descent. Finally, we perform extensive numerical experiments on computational clusters to demonstrate the advantage of GNSD over other state-of-the-art methods.
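The abstract describes an update that couples a consensus (mixing) step on the local parameters with a gradient-tracking step, so each node descends along an estimate of the network-wide average stochastic gradient. Below is a minimal NumPy sketch of such a gradient-tracking stochastic update; the four-node ring topology, the mixing matrix W, the local quadratic losses, the Gaussian gradient noise, and the constant step-size are all illustrative assumptions (the paper's analysis uses a shrinking step-size), not the authors' experimental setup.

import numpy as np

rng = np.random.default_rng(0)
n_nodes, dim, T = 4, 5, 2000
alpha = 0.01  # constant step-size for this sketch; the paper shrinks it over time

# Doubly stochastic mixing matrix for an assumed 4-node ring topology.
W = np.array([[0.50, 0.25, 0.00, 0.25],
              [0.25, 0.50, 0.25, 0.00],
              [0.00, 0.25, 0.50, 0.25],
              [0.25, 0.00, 0.25, 0.50]])

# Each node i holds an assumed local loss f_i(x) = 0.5 * ||A_i x - b_i||^2.
A = rng.standard_normal((n_nodes, dim, dim))
b = rng.standard_normal((n_nodes, dim))

def stoch_grad(i, x):
    # Noisy gradient of node i's local loss; Gaussian noise stands in
    # for mini-batch sampling in this sketch.
    return A[i].T @ (A[i] @ x - b[i]) + 0.1 * rng.standard_normal(dim)

x = np.zeros((n_nodes, dim))                                   # local iterates
g = np.stack([stoch_grad(i, x[i]) for i in range(n_nodes)])
y = g.copy()                                                   # gradient trackers, y^0 = g^0

for _ in range(T):
    x_new = W @ x - alpha * y                                  # mix with neighbors, then descend along tracker
    g_new = np.stack([stoch_grad(i, x_new[i]) for i in range(n_nodes)])
    y = W @ y + g_new - g                                      # track the average stochastic gradient
    x, g = x_new, g_new

print("consensus disagreement:", np.linalg.norm(x - x.mean(axis=0)))

The distinguishing ingredient relative to plain decentralized SGD is the tracker y: the recursion y ← W y + g_new − g preserves the network average of the nodes' stochastic gradients, so each local update follows an estimate of the global gradient rather than the purely local one.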
Year
2019
DOI
10.1109/DSW.2019.8755807
Venue
2019 IEEE Data Science Workshop (DSW)
Keywords
Stochastic, decentralized, gradient tracking, nonconvex optimization, neural networks
Field
Control theory, Stochastic gradient descent, Computer science, Algorithm, Stationary point, Rate of convergence, Artificial neural network, Big data, Optimization problem, Computational resource
DocType
Conference
ISBN
978-1-7281-0709-7
Citations
0
PageRank
0.34
References
12
Authors
4
Name           Order  Citations  PageRank
Songtao Lu     1      84         19.52
Xinwei Zhang   2      6          1.53
Haoran Sun     3      53         4.14
Mingyi Hong    4      1533       91.29