Title
Huber-Norm Regularization for Linear Prediction Models
Abstract
To avoid overfitting, it is common practice to regularize linear prediction models using squared or absolute-value norms of the model parameters. In this article we consider a new method of regularization: Huber-norm regularization, which imposes a combination of $\ell_1$- and $\ell_2$-norm regularization on the model parameters. We derive the dual optimization problem, prove an upper bound on the statistical risk of the model class by means of the Rademacher complexity, and establish a simple type of oracle inequality on the optimality of the decision rule. Empirically, we observe that logistic regression with the Huber-norm regularizer outperforms $\ell_1$-norm, $\ell_2$-norm, and elastic-net regularization on a wide range of benchmark data sets.
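For illustration only (this is not the authors' implementation, which works with the dual problem), the following minimal Python sketch shows the idea of the abstract: the standard Huber function with threshold `delta` is applied elementwise to the weight vector, so small weights are penalized quadratically ($\ell_2$-like) and large weights linearly ($\ell_1$-like). The function names, the hyperparameters `lam`, `delta`, `lr`, `n_iter`, the plain gradient-descent solver, and the {-1, +1} label convention are all illustrative assumptions.

```python
import numpy as np

def huber_penalty(w, delta=1.0):
    """Sum of the elementwise Huber function over the weights:
    quadratic (l2-like) for |w_j| <= delta, linear (l1-like) beyond."""
    a = np.abs(w)
    return np.sum(np.where(a <= delta, 0.5 * w ** 2, delta * (a - 0.5 * delta)))

def huber_penalty_grad(w, delta=1.0):
    """Gradient of the Huber penalty: w in the quadratic region,
    delta * sign(w) in the linear region."""
    return np.where(np.abs(w) <= delta, w, delta * np.sign(w))

def fit_logistic_huber(X, y, lam=0.1, delta=1.0, lr=0.1, n_iter=2000):
    """Logistic regression with a Huber-norm regularizer, trained by
    plain gradient descent; y is expected in {-1, +1}."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iter):
        margins = y * (X @ w)
        # gradient of the mean logistic loss (1/n) * sum log(1 + exp(-y x.w))
        grad_loss = -(X.T @ (y / (1.0 + np.exp(margins)))) / n
        w -= lr * (grad_loss + lam * huber_penalty_grad(w, delta))
    return w
```

A call such as `w = fit_logistic_huber(X, y)` then yields weights in which small coordinates are shrunk smoothly while large coordinates incur only a constant-slope penalty, the combination of $\ell_1$- and $\ell_2$-norm regularization described in the abstract.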
Year
2016
DOI
10.1007/978-3-319-46128-1_45
Venue
ECML/PKDD
Field
Decision rule, Applied mathematics, Mathematical optimization, Square (algebra), Upper and lower bounds, Rademacher complexity, Linear prediction, Regularization (mathematics), Overfitting, Optimization problem, Mathematics
DocType
Conference
Citations
2
PageRank
0.39
References
7
Authors
5
Name                    Order  Citations  PageRank
Oleksandr Zadorozhnyi   1      2          0.73
Gunthard Benecke        2      2          0.39
Stephan Mandt           3      128        19.55
Tobias Scheffer         4      1862       139.64
Marius Kloft            5      402        35.48