Abstract
---
The lasso and its variants have attracted much attention recently because of their ability to perform estimation and variable selection simultaneously. When prior knowledge is available in an application, the performance of estimation and variable selection can be further improved by incorporating that knowledge as constraints on the parameters. In this article, we consider the linearly constrained generalized lasso, where the constraints are linear inequalities, equalities, or both. The dual of the problem is derived and is much simpler than the original one; as a by-product, a coordinate descent algorithm becomes feasible for solving the dual. A formula for the number of degrees of freedom is derived, and a method for selecting the tuning parameter is also discussed.
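The problem described in the abstract can be sketched as follows. The notation below is assumed here (following the standard generalized-lasso formulation) and is not taken from the paper itself:

```latex
% Linearly constrained generalized lasso (notation assumed, not from the paper):
%   y : response vector, X : design matrix,
%   D : penalty matrix of the generalized lasso (D = I recovers the plain lasso),
%   A\beta \le b and C\beta = d encode the linear inequality/equality constraints.
\min_{\beta}\; \tfrac{1}{2}\,\lVert y - X\beta \rVert_2^2
  + \lambda \lVert D\beta \rVert_1
\quad \text{subject to} \quad
A\beta \le b, \qquad C\beta = d.
```

With no constraints and a general penalty matrix D, this reduces to the generalized lasso; the paper's contribution, per the abstract, is handling the added linear constraints through the dual problem.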
Year | DOI | Venue
---|---|---
2015 | 10.1016/j.csda.2014.12.010 | Computational Statistics & Data Analysis
Keywords | Field | DocType
---|---|---
constrained optimization, degrees of freedom, coordinate descent, KKT condition, lasso, duality | Mathematical optimization, Feature selection, Lasso (statistics), Duality (optimization), Coordinate descent, Karush–Kuhn–Tucker conditions, Linear inequality, Mathematics, Constrained optimization | Journal
Volume | Issue | ISSN
---|---|---
86 | C | 0167-9473
Citations | PageRank | References
---|---|---
1 | 0.38 | 3
Authors
---
3