Title
Accelerated Stochastic ADMM with Variance Reduction
Abstract
The Alternating Direction Method of Multipliers (ADMM) is a popular method for solving machine learning problems. Stochastic ADMM was first proposed to reduce the per-iteration computational complexity, making it better suited to big-data problems. Recently, variance-reduction techniques have been integrated with stochastic ADMM to obtain fast convergence rates, as in SAG-ADMM and SVRG-ADMM, but the resulting convergence is still suboptimal with respect to the smoothness constant. In this paper, we propose a new accelerated stochastic ADMM algorithm with variance reduction, which enjoys faster convergence than the other stochastic ADMM algorithms. We theoretically analyze its convergence rate and show that its dependence on the smoothness constant is optimal. We also empirically validate its effectiveness and show its superiority over other stochastic ADMM algorithms.
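For context (a standard sketch of the setting, not taken from the paper itself): ADMM targets linearly constrained composite problems, and the variance-reduction techniques named above (e.g., SVRG-ADMM) replace the full gradient of the finite-sum term in the x-update with a control-variate estimator. Assuming the usual finite-sum formulation, the problem and the SVRG-style estimator read

\min_{x,y}\; f(x) + g(y) \quad \text{s.t.} \quad Ax + By = c, \qquad f(x) = \frac{1}{n}\sum_{i=1}^{n} f_i(x),

\hat{\nabla} f(x_k) = \nabla f_{i_k}(x_k) - \nabla f_{i_k}(\tilde{x}) + \nabla f(\tilde{x}), \qquad i_k \sim \mathrm{Uniform}\{1,\dots,n\},

where \tilde{x} is a periodically refreshed snapshot iterate. The estimator is unbiased, and its variance shrinks as x_k and \tilde{x} approach a minimizer, which is what enables the faster rates discussed in the abstract.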
Year
2016
Venue
arXiv: Numerical Analysis
Field
Convergence, Mathematical optimization, Rate of convergence, Smoothness, Variance reduction, Big data, Mathematics, Computational complexity theory
DocType
Journal
Volume
abs/1611.04074
Citations
0
PageRank
0.34
References
0
Authors
4
Name           Order   Citations   PageRank
Chao Zhang     1       8           5.49
Zebang Shen    2       17          9.36
Hui Qian       3       59          13.26
Tengfei Zhou   4       22          5.08