Title
On Convergence of Model Parallel Proximal Gradient Algorithm for Stale Synchronous Parallel System
Abstract
With ever-growing data volume and model size, error-tolerant, communication-efficient, yet versatile parallel algorithms have become vital to the success of many large-scale applications. In this work we propose msPG, an extension of the flexible proximal gradient algorithm to the model-parallel and stale-synchronous setting. The worker machines of msPG operate asynchronously as long as they are not too far apart, and they communicate efficiently through a dedicated parameter server. Theoretically, we provide a rigorous analysis of the convergence properties of msPG; a salient feature of our analysis is its generality, which accommodates both nonsmooth and nonconvex functions. Under mild conditions, we prove that the whole iterate sequence of msPG converges to a critical point (which is a global optimum under convexity assumptions). We further provide an economical implementation of msPG that completely bypasses the need to keep a full local model. We confirm our theoretical findings through numerical experiments.
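The setting the abstract describes, in which each worker owns a block of the model, applies proximal gradient updates using model copies that may lag by a bounded number of iterations, and exchanges blocks through a parameter server, can be illustrated with a small simulation. The following single-process Python sketch shows the bounded-staleness, model-parallel proximal gradient idea on a lasso problem. It is not the paper's msPG implementation; the staleness bound S, step size, block partition, random lag pattern, and objective are illustrative assumptions.

import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def stale_model_parallel_prox_grad(A, b, lam, n_workers=4, S=2, n_iters=300, seed=0):
    # Lasso objective: 0.5 * ||A x - b||^2 + lam * ||x||_1.
    m, n = A.shape
    step = 1.0 / np.linalg.norm(A, 2) ** 2            # 1/L for the smooth part
    rng = np.random.default_rng(seed)
    blocks = np.array_split(np.arange(n), n_workers)  # each worker owns one block
    x = np.zeros(n)
    history = [x.copy()]                              # past iterates still served
    for _ in range(n_iters):
        x_new = x.copy()
        for blk in blocks:
            # A worker's own block is always current; blocks owned by other
            # workers may come from a snapshot lagging by at most S iterations.
            lag = int(rng.integers(0, min(S, len(history) - 1) + 1))
            x_view = history[-1 - lag].copy()
            x_view[blk] = x[blk]
            # Partial gradient of the smooth part w.r.t. the owned coordinates,
            # evaluated at the (possibly stale) model view.
            grad_blk = A[:, blk].T @ (A @ x_view - b)
            # Proximal gradient update restricted to the owned block.
            x_new[blk] = soft_threshold(x[blk] - step * grad_blk, step * lam)
        x = x_new
        history.append(x.copy())
        if len(history) > S + 1:                      # keep only the last S+1 iterates
            history.pop(0)
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A = rng.standard_normal((100, 50))
    x_true = np.zeros(50)
    x_true[:5] = rng.standard_normal(5)
    b = A @ x_true + 0.01 * rng.standard_normal(100)
    x_hat = stale_model_parallel_prox_grad(A, b, lam=0.1)
    obj = 0.5 * np.sum((A @ x_hat - b) ** 2) + 0.1 * np.sum(np.abs(x_hat))
    print("objective:", obj, "nonzeros:", int(np.count_nonzero(x_hat)))

The random per-block lag here merely stands in for asynchronous workers reading from the parameter server; in an actual deployment the staleness pattern would be determined by the stale-synchronous-parallel protocol rather than sampled.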
Year: 2016
Venue: JMLR Workshop and Conference Proceedings
Field: Convergence, Mathematical optimization, Computer science, Limit point, Global model
DocType: Conference
Volume: 51
ISSN: 1938-7288
Citations: 2
PageRank: 0.37
References: 19
Authors: 5
Name | Order | Citations | PageRank
Yi Zhou | 1 | 65 | 17.55
Yaoliang Yu | 2 | 669 | 34.33
Wei Dai | 3 | 333 | 12.77
Yingbin Liang | 4 | 1646 | 147.64
Eric P. Xing | 5 | 7332 | 471.43