Title
A SUBSPACE ACCELERATION METHOD FOR MINIMIZATION INVOLVING A GROUP SPARSITY-INDUCING REGULARIZER
Abstract
We consider the problem of minimizing an objective function that is the sum of a convex function and a group sparsity-inducing regularizer. Problems that integrate such regularizers arise in modern machine learning applications, often for the purpose of obtaining models that are easier to interpret and that have higher predictive accuracy. We present a new method for solving such problems that utilizes subspace acceleration, domain decomposition, and support identification. Our analysis provides the global iteration complexity of obtaining an ε-accurate solution and shows that, under common assumptions, the iterates converge superlinearly in a local sense. Numerical results on regularized logistic and linear regression problems show that our approach is efficient and reliable and outperforms state-of-the-art methods on interesting classes of problems, especially when the number of data points is larger than the number of features. When the number of data points is smaller than the number of features, algorithms that solve a dual problem may be more efficient than our approach, which solves the primal problem.
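The problem class described in the abstract can be sketched in the standard group-regularized form (an assumption on notation, since the paper's exact formulation is not reproduced in this record): f is the convex loss (e.g., logistic or least-squares), the index sets g partition the variables into groups, and the mixed l1/l2 penalty drives entire groups of coordinates to zero.

```latex
\min_{x \in \mathbb{R}^n} \; f(x) + \lambda \sum_{g \in \mathcal{G}} \|x_g\|_2,
\qquad \lambda > 0,
```

Here $x_g$ denotes the subvector of $x$ indexed by group $g \in \mathcal{G}$; the non-smooth sum of unsquared $\ell_2$ norms is what induces group sparsity, and identifying which groups are zero at the solution (support identification) is what enables the subspace acceleration the abstract refers to.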
Year
2022
DOI
10.1137/21M1411111
Venue
SIAM JOURNAL ON OPTIMIZATION
Keywords
nonlinear optimization, convex optimization, worst-case iteration complexity, regularization, group regularizer, sparsity, logistic regression, linear regression, subspace acceleration
DocType
Journal
Volume
32
Issue
2
ISSN
1052-6234
Citations
0
PageRank
0.34
References
0
Authors
3
Name, Order, Citations, PageRank
Frank E. Curtis, 1, 432, 25.71
Yutong Dai, 2, 0, 0.34
Daniel P. Robinson, 3, 261, 21.51