Title
SDNA: Stochastic Dual Newton Ascent for Empirical Risk Minimization.
Abstract
We propose a new algorithm for minimizing regularized empirical loss: Stochastic Dual Newton Ascent (SDNA). Our method is dual in nature: in each iteration we update a random subset of the dual variables. However, unlike existing methods such as stochastic dual coordinate ascent, SDNA is capable of utilizing all curvature information contained in the examples, which leads to striking improvements in both theory and practice - sometimes by orders of magnitude. In the special case when an L2-regularizer is used in the primal, the dual problem is a concave quadratic maximization problem plus a separable term. In this regime, SDNA in each step solves a proximal subproblem involving a random principal submatrix of the Hessian of the quadratic function; whence the name of the method. If, in addition, the loss functions are quadratic, our method can be interpreted as a novel variant of the recently introduced Iterative Hessian Sketch.
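The abstract's description of the SDNA step (exact maximization of the concave quadratic dual over a random subset of coordinates, using the corresponding random principal submatrix of the Hessian) can be illustrated with a small sketch. The sketch below assumes the standard SDCA-style dual of ridge regression (L2 regularizer, quadratic loss); the function name sdna_ridge, the block size tau, the iteration count, and all scalings are illustrative assumptions, not the authors' reference implementation.

# Minimal SDNA-style sketch for ridge regression (assumed formulation):
#   primal:  min_w (1/(2n)) ||Xw - y||^2 + (lam/2) ||w||^2
#   dual:    max_a D(a) = -(1/(2*lam*n^2)) a'XX'a - (1/(2n))||a||^2 + (1/n) a'y
# Each step solves the block exactly via a tau-by-tau linear system.
import numpy as np

def sdna_ridge(X, y, lam=0.1, tau=8, n_iters=3000, seed=0):
    rng = np.random.default_rng(seed)
    n, d = X.shape
    # D(a) = -0.5 a'Ma + b'a, with M the (negated) Hessian of the smooth part.
    M = X @ X.T / (lam * n**2) + np.eye(n) / n
    b = y / n
    alpha = np.zeros(n)
    grad = b - M @ alpha                      # gradient of D at alpha
    for _ in range(n_iters):
        S = rng.choice(n, size=min(tau, n), replace=False)
        # Exact maximization over the block S: random principal submatrix of M.
        delta = np.linalg.solve(M[np.ix_(S, S)], grad[S])
        alpha[S] += delta
        grad -= M[:, S] @ delta               # keep the full gradient current
    w = X.T @ alpha / (lam * n)               # primal point from dual variables
    return w, alpha

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.standard_normal((200, 20))
    y = X @ rng.standard_normal(20) + 0.1 * rng.standard_normal(200)
    w, _ = sdna_ridge(X, y, lam=0.1)
    # Sanity check against the closed-form ridge solution.
    n, d = X.shape
    w_star = np.linalg.solve(X.T @ X / n + 0.1 * np.eye(d), X.T @ y / n)
    print(np.linalg.norm(w - w_star))

Taking tau = 1 reduces this sketch to plain dual coordinate ascent; larger tau lets each step exploit the curvature captured by the tau-by-tau submatrix, which is the mechanism the abstract credits for the speedup.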
Year
2015
Venue
ICML
Field
Mathematical optimization, Empirical risk minimization, Quadratic equation, Separable space, Hessian matrix, Quadratic function, Duality (optimization), Mathematics, Maximization, Special case
DocType
Volume
abs/1502.02268
Citations
21
Journal
PageRank
0.76
References
25
Authors
4
Name              Order  Citations  PageRank
Zheng Qu          1      151        27.47
Peter Richtárik   2      1314       84.53
Martin Takác      3      752        49.49
Olivier Fercoq    4      160        16.78