Title
Scalable Algorithms For The Sparse Ridge Regression
Abstract
Sparse regression and variable selection for large-scale data have developed rapidly over the past decades. This work focuses on sparse ridge regression, which enforces sparsity through the ℓ0 norm. We first prove that the continuous relaxation of the mixed-integer second-order conic (MISOC) reformulation, obtained via the perspective formulation, is equivalent to that of the convex integer formulation proposed in recent work. We also show that the convex hull of the constraint system of the MISOC formulation equals its continuous relaxation. Building on these two formulations (i.e., the MISOC formulation and the convex integer formulation), we analyze two scalable algorithms for sparse ridge regression, a greedy algorithm and a randomized algorithm, with desirable theoretical properties. The proposed algorithms are proven to yield near-optimal solutions under mild conditions. We further propose integrating the greedy algorithm with the randomized algorithm, greedily searching for features within the nonzero support identified by the continuous relaxation of the MISOC formulation. The merits of the proposed methods are illustrated through numerical examples in comparison with several existing methods.
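The abstract describes a greedy algorithm for the ℓ0-constrained (sparse) ridge regression problem, min over beta of ||y - X beta||^2 + gamma ||beta||^2 subject to ||beta||_0 <= k. Below is a minimal forward-selection sketch in Python/NumPy illustrating that idea; it is not the authors' implementation, and the function name greedy_sparse_ridge and the parameters gamma (ridge weight) and k (sparsity level) are illustrative assumptions.

```python
import numpy as np

def greedy_sparse_ridge(X, y, k, gamma):
    """Greedy (forward-selection) heuristic for l0-constrained ridge regression:
        min_beta ||y - X beta||^2 + gamma * ||beta||^2   s.t.  ||beta||_0 <= k.
    At each step, add the feature whose inclusion gives the smallest ridge objective.
    Illustrative sketch only, not the paper's exact algorithm.
    """
    n, p = X.shape
    support = []
    remaining = set(range(p))
    for _ in range(min(k, p)):
        best_j, best_obj = None, np.inf
        for j in remaining:
            cols = support + [j]
            Xs = X[:, cols]
            # Ridge solution restricted to the candidate support.
            beta_s = np.linalg.solve(Xs.T @ Xs + gamma * np.eye(len(cols)), Xs.T @ y)
            obj = np.sum((y - Xs @ beta_s) ** 2) + gamma * np.sum(beta_s ** 2)
            if obj < best_obj:
                best_j, best_obj = j, obj
        support.append(best_j)
        remaining.remove(best_j)
    # Final ridge coefficients on the selected support, embedded in a length-p vector.
    Xs = X[:, support]
    beta_s = np.linalg.solve(Xs.T @ Xs + gamma * np.eye(len(support)), Xs.T @ y)
    beta = np.zeros(p)
    beta[support] = beta_s
    return support, beta
```

Following the integrated approach mentioned in the abstract, the candidate set (here the variable remaining) could be restricted to the nonzero support identified by the continuous relaxation of the MISOC formulation instead of all p features.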
Year: 2020
DOI: 10.1137/19M1245414
Venue: SIAM JOURNAL ON OPTIMIZATION
Keywords: approximation algorithm, chance constraint, conic program, mixed integer, ridge regression
DocType: Journal
Volume: 30
Issue: 4
ISSN: 1052-6234
Citations: 1
PageRank: 0.35
References: 0
Authors: 2
Name         Order  Citations  PageRank
Weijun Xie   1      31         6.74
Xinwei Deng  2      3          2.41