Title
FaRSA for ℓ1-regularized convex optimization: local convergence and numerical experience.
Abstract
FaRSA is a new method for minimizing the sum of a differentiable convex function and an ℓ1-norm regularizer. The main features of the method include: (i) an evolving set of indices corresponding to variables that are predicted to be nonzero at a solution; (ii) subproblems that only need to be solved in a reduced space to lessen per-iteration computational costs; (iii) conditions that determine how accurately each subproblem must be solved, which allow conjugate gradient or coordinate descent techniques to be employed; (iv) a computationally practical condition that dictates when the subspace explored by the current subproblem should be updated; and (v) a reduced proximal gradient step that ensures a sufficient decrease in the objective function when it is decided that the index set holding the nonzero variables should be expanded. We proved global convergence of the method and demonstrated its performance on a set of model prediction problems with a Matlab implementation. Here, we introduce an enhanced s...
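The abstract outlines the main algorithmic ingredients of FaRSA: a partition of the variables into a set predicted to be nonzero (handled by a reduced-space subproblem solve) and a remaining set (handled by a reduced proximal gradient step). The sketch below is a minimal, illustrative Python rendering of one such iteration under simplifying assumptions, not the authors' implementation: it uses a fixed step size alpha, a plain gradient step in place of the paper's CG/coordinate-descent subproblem solve, and the sign pattern of the current iterate to define the reduced space. All names (soft_threshold, farsa_style_step) are hypothetical.

```python
import numpy as np


def soft_threshold(z, tau):
    # Proximal operator of tau * ||.||_1 (elementwise soft-thresholding).
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)


def farsa_style_step(x, grad_f, lam, alpha=1.0):
    """One illustrative FaRSA-style iteration for min f(x) + lam * ||x||_1.

    x      : current iterate (numpy array)
    grad_f : gradient of the smooth term f at x
    lam    : l1 regularization weight
    alpha  : fixed step size (an assumption; the paper uses line searches)
    """
    nonzero = x != 0          # variables predicted to be nonzero at a solution
    zero = ~nonzero

    # Optimality measure on the zero block: how far a proximal gradient
    # step would move those variables away from zero.
    beta = soft_threshold(x[zero] - alpha * grad_f[zero], alpha * lam)

    # Optimality measure on the nonzero block: gradient of the objective
    # restricted to the smooth manifold fixed by the signs of x.
    phi = grad_f[nonzero] + lam * np.sign(x[nonzero])

    x_new = x.copy()
    if np.linalg.norm(phi) >= np.linalg.norm(beta):
        # Reduced-space step: a plain gradient step stands in for the
        # inexact CG/coordinate-descent subproblem solve of the paper.
        step = x[nonzero] - alpha * phi
        # Variables whose sign would flip are set to zero instead.
        step[np.sign(step) != np.sign(x[nonzero])] = 0.0
        x_new[nonzero] = step
    else:
        # Reduced proximal gradient step on the zero block, which may
        # expand the index set of nonzero variables.
        x_new[zero] = beta
    return x_new


if __name__ == "__main__":
    # Tiny illustration on f(x) = 0.5 * ||A x - b||^2 with random data.
    rng = np.random.default_rng(0)
    A, b = rng.standard_normal((20, 5)), rng.standard_normal(20)
    lam, x = 0.1, np.zeros(5)
    for _ in range(50):
        x = farsa_style_step(x, A.T @ (A @ x - b), lam, alpha=1e-2)
    print("approximate solution:", x)
```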
Year
2018
Venue
Optimization Methods and Software
Field
Conjugate gradient method, Mathematical optimization, Index set, Nonlinear programming, Differentiable function, Convex function, Local convergence, Coordinate descent, Convex optimization, Mathematics
DocType
Journal
Volume
33
Issue
2
Citations
0
PageRank
0.34
References
7
Authors
3
Name | Order | Citations | PageRank
Tianyi Chen | 1 | 94 | 14.50
Frank E. Curtis | 2 | 432 | 25.71
Daniel P. Robinson | 3 | 261 | 21.51