Title
A First Order Free Lunch for SQRT-Lasso
Abstract
Many statistical machine learning techniques sacrifice convenient computational structures to gain estimation robustness and modeling flexibility. In this paper, we study this fundamental tradeoff through a SQRT-Lasso problem for sparse linear regression and sparse precision matrix estimation in high dimensions. We explain how novel optimization techniques help address these computational challenges. Namely, we propose a pathwise iterative smoothing shrinkage thresholding algorithm for solving the SQRT-Lasso optimization problem, and provide a novel model-based perspective for analyzing the smoothing optimization framework, which allows us to establish a near linear convergence (R-linear convergence) guarantee for our proposed algorithm, without sacrificing statistical accuracy. This implies that solving the SQRT-Lasso optimization problem is almost as easy as solving the Lasso optimization problem, while the former requires much less parameter tuning effort. Moreover, we show that our proposed algorithm can also be applied to sparse precision matrix estimation, and enjoys desirable computational as well as statistical properties. Numerical experiments are provided to support our theory.
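For concreteness, the tradeoff the abstract describes comes from the shape of the two objectives. In standard notation (design matrix $X \in \mathbb{R}^{n \times d}$, response $y \in \mathbb{R}^n$), the Lasso and SQRT-Lasso estimators are

\min_{\beta \in \mathbb{R}^d} \frac{1}{2n} \|y - X\beta\|_2^2 + \lambda \|\beta\|_1 \quad \text{(Lasso)},
\qquad
\min_{\beta \in \mathbb{R}^d} \frac{1}{\sqrt{n}} \|y - X\beta\|_2 + \lambda \|\beta\|_1 \quad \text{(SQRT-Lasso)}.

The SQRT-Lasso loss is not differentiable where $X\beta = y$, which is the convenient computational structure being sacrificed; in exchange, the theoretically optimal $\lambda$ scales as $\sqrt{\log d / n}$ without involving the unknown noise level, which is why the abstract says it requires much less parameter tuning effort.

The sketch below illustrates the kind of smoothing-plus-shrinkage-thresholding iteration the abstract names. It is a minimal reconstruction, not the authors' implementation: the smoothing surrogate sqrt(||r||^2 + mu^2)/sqrt(n), the conservative step size, the fixed iteration count, and the function name are all assumptions made here for illustration, and the pathwise (warm-started) continuation over a decreasing sequence of lambda values is omitted.

import numpy as np

def smoothed_sqrt_lasso_ista(X, y, lam, mu=1e-3, n_iter=500):
    """Approximately minimize ||y - X b||_2 / sqrt(n) + lam * ||b||_1.

    Illustrative sketch (not the paper's algorithm): the non-smooth
    l2 loss is replaced by the smooth surrogate
    sqrt(||y - X b||_2^2 + mu^2) / sqrt(n), then an iterative
    shrinkage-thresholding step (gradient step + soft-threshold)
    is applied.
    """
    n, d = X.shape
    b = np.zeros(d)
    # The surrogate's gradient is Lipschitz with constant at most
    # ||X||_2^2 / (sqrt(n) * mu), so a safe step size is its inverse.
    step = np.sqrt(n) * mu / np.linalg.norm(X, 2) ** 2
    for _ in range(n_iter):
        r = X @ b - y
        # Gradient of the smoothed loss.
        grad = X.T @ r / (np.sqrt(n) * np.sqrt(r @ r + mu ** 2))
        z = b - step * grad
        # Soft-thresholding: the shrinkage step of ISTA.
        b = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)
    return b

A pathwise version would run this for a decreasing sequence of lambda values, warm-starting each solve at the previous solution, which is the pathwise scheme the abstract refers to.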
Year
2016
Venue
arXiv: Learning
Field
Convergence (routing), Mathematical optimization, First order, Lasso (statistics), Robustness (computer science), Smoothing, Rate of convergence, Artificial intelligence, Optimization problem, Mathematics, Machine learning, Linear regression
DocType
Journal
Volume
abs/1605.07950
Citations
1
PageRank
0.36
References
10
Authors
6
Name             Order  Citations  PageRank
Xingguo Li       1      96         19.95
Jarvis D. Haupt  2      11         3.23
R. Arora         3      489        35.97
Han Liu          4      434        42.70
Mingyi Hong      5      1533       91.29
Tuo Zhao         6      222        40.58