Title |
---|
Comparative study of computational algorithms for the Lasso with high-dimensional, highly correlated data |
Abstract |
---|
Variable selection is important in high-dimensional data analysis. Lasso regression is useful because it yields sparse solutions through a soft-thresholding rule and is computationally efficient. However, since the Lasso penalized likelihood contains a nondifferentiable term, standard optimization tools cannot be applied. Many computational algorithms have been proposed to optimize the Lasso penalized likelihood in high-dimensional settings, including the coordinate descent (CD) algorithm, majorization-minimization (MM) using local quadratic approximation, the fast iterative shrinkage-thresholding algorithm (FISTA), and the alternating direction method of multipliers (ADMM). In this paper, we undertake a comparative study that analyzes the relative merits of these algorithms. We are especially concerned with numerical sensitivity to the correlation between the covariates. We conduct a simulation study considering factors that affect the condition number of the covariance matrix of the covariates, as well as the level of penalization. We apply the algorithms to cancer biomarker discovery, and compare convergence speed and stability. |
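To make the abstract concrete, the coordinate descent approach it mentions can be sketched briefly. This is a minimal illustration, not the paper's implementation: it assumes the standard Lasso objective (1/(2n))·‖y − Xβ‖² + λ‖β‖₁ and cycles through the coordinates, each solved in closed form by the soft-thresholding ("soft-decision") rule.

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator: closed-form solution of the
    one-dimensional Lasso subproblem."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=100):
    """Cyclic coordinate descent for the Lasso:
    minimize (1/(2n)) * ||y - X @ beta||^2 + lam * ||beta||_1."""
    n, p = X.shape
    beta = np.zeros(p)
    r = y.copy()                         # residual y - X @ beta (beta = 0)
    col_sq = (X ** 2).sum(axis=0) / n    # per-coordinate curvature
    for _ in range(n_iter):
        for j in range(p):
            r += X[:, j] * beta[j]       # add back coordinate j's contribution
            zj = X[:, j] @ r / n         # correlation with partial residual
            beta[j] = soft_threshold(zj, lam) / col_sq[j]
            r -= X[:, j] * beta[j]       # update residual with new beta_j
    return beta
```

With λ at or above max|xⱼᵀy|/n, every coordinate update is thresholded to zero, so the estimate stays at the zero vector; smaller λ values let coordinates enter the model one soft-thresholded update at a time.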
Year | DOI | Venue |
---|---|---|
2018 | https://doi.org/10.1007/s10489-016-0850-7 | Appl. Intell. |
Keywords | Field | DocType |
Lasso,Majorization-minimization,Coordinate descent,ADMM,FISTA | Convergence,Feature selection,Computer science,Lasso (statistics),Artificial intelligence,Coordinate descent,Condition number,Mathematical optimization,Covariate,Likelihood function,Algorithm,Covariance matrix,Machine learning | Journal |
Volume | Issue | ISSN |
48 | 8 | 0924-669X |
Citations | PageRank | References |
1 | 0.35 | 4 |
Authors |
---|
3 |
Name | Order | Citations | PageRank |
---|---|---|---|
Baekjin Kim | 1 | 1 | 1.03 |
Donghyeon Yu | 2 | 3 | 2.10 |
Joong-Ho Won | 3 | 13 | 5.56 |