Abstract | ||
---|---|---|
In the high-dimensional setting, the componentwise L2Boosting method has been used to construct sparse models with high prediction accuracy, but it tends to select many ineffective variables. Several sparse boosting methods, such as SparseL2Boosting and Twin Boosting, have been proposed to improve the variable selection of the L2Boosting algorithm. In this paper, we propose a new general sparse boosting method (GSBoosting). In the orthogonal linear model, relations are established between GSBoosting and other well-known regularized variable selection methods, such as the adaptive Lasso and hard thresholding. Simulation results show that GSBoosting performs well in both prediction and variable selection. © Springer-Verlag 2012. |
Year | DOI | Venue |
---|---|---|
2012 | 10.1007/978-3-642-35527-1_14 | ADMA |
Keywords | Field | DocType
---|---|---|
adaptive lasso, boosting algorithm, model selection, sparsity | Feature selection, Linear model, Computer science, Sparse model, Sparse approximation, Lasso (statistics), Algorithm, Model selection, Correlation, Boosting (machine learning), Artificial intelligence, Machine learning | Conference
Volume | Issue | ISSN
---|---|---|
7713 LNAI | null | 1611-3349
Citations | PageRank | References
---|---|---|
0 | 0.34 | 2
Authors | ||
---|---|---|
1 |
Name | Order | Citations | PageRank |
---|---|---|---|
Junlong Zhao | 1 | 0 | 2.37 |