Abstract |
---|
Logistic regression is one of the most commonly applied statistical methods for binary classification problems. This paper considers the nonnegative garrote regularization penalty in logistic models and derives an optimization algorithm for minimizing the resultant penalty function. The search algorithm is computationally efficient and can be used even when the number of regressors is much larger than the number of samples. As the nonnegative garrote requires an initial estimate of the parameters, a number of possible estimators are compared and contrasted. Logistic regression with the nonnegative garrote is then compared with several popular regularization methods in a set of comprehensive numerical simulations. The proposed method attained excellent performance in terms of prediction rate and variable selection accuracy on both real and artificially generated data. |
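To make the idea concrete: the nonnegative garrote rescales an initial coefficient estimate β̂ by per-coefficient factors c_j ≥ 0, chosen to minimize the penalized negative log-likelihood plus λ Σ c_j. The sketch below is a minimal, generic illustration of that formulation using an off-the-shelf optimizer, not the paper's own (more efficient) search algorithm; the function name `nng_logistic` and the choice of a near-unpenalized logistic fit as the initial estimate are assumptions for illustration.

```python
import numpy as np
from scipy.optimize import minimize
from sklearn.linear_model import LogisticRegression
from sklearn.datasets import make_classification

def nng_logistic(X, y, lam=1.0):
    """Nonnegative garrote for logistic regression (illustrative sketch).

    Minimizes  -loglik(c * beta0) + lam * sum(c)  subject to c >= 0,
    where beta0 is an initial coefficient estimate.
    """
    # Initial estimate: a nearly unpenalized logistic fit (large C).
    beta0 = LogisticRegression(C=1e6, fit_intercept=False,
                               max_iter=1000).fit(X, y).coef_.ravel()

    def objective(c):
        z = X @ (c * beta0)
        # Numerically stable negative log-likelihood for y in {0, 1}:
        # sum(log(1 + exp(z)) - y * z)
        nll = np.sum(np.logaddexp(0.0, z) - y * z)
        return nll + lam * np.sum(c)

    p = X.shape[1]
    res = minimize(objective, x0=np.ones(p),
                   bounds=[(0.0, None)] * p, method="L-BFGS-B")
    # Garrote-shrunken coefficients; c_j driven to zero drop variables.
    return res.x * beta0

# Toy usage on synthetic data with few informative features.
X, y = make_classification(n_samples=200, n_features=10,
                           n_informative=3, random_state=0)
beta = nng_logistic(X, y, lam=5.0)
```

Because the scaling factors are constrained to be nonnegative and penalized by their sum, some c_j reach exactly zero at the solution, which is what gives the garrote its variable-selection behavior.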
Year | DOI | Venue
---|---|---
2011 | 10.1007/978-3-642-25832-9_9 | Australasian Conference on Artificial Intelligence
Keywords | Field | DocType
---|---|---
popular regularization method, optimization algorithm, resultant penalty function, search algorithm, logistic model, binary classification problem, logistic regression, nonnegative garrote regularization penalty, nonnegative garrote, comprehensive numerical simulation | Mathematical optimization, Search algorithm, Feature selection, Binary classification, Regularization (mathematics), Optimization algorithm, Logistic regression, Mathematics, Estimator, Penalty method | Conference
Volume | ISSN | Citations
---|---|---
7106 | 0302-9743 | 0
PageRank | References | Authors
---|---|---
0.34 | 2 | 2
Name | Order | Citations | PageRank
---|---|---|---
Enes Makalic | 1 | 55 | 11.54
Daniel F. Schmidt | 2 | 51 | 10.68