Title
Generalized Linear Model Regression under Distance-to-set Penalties.
Abstract
Estimation in generalized linear models (GLMs) is complicated by the presence of constraints. One can handle constraints by maximizing a penalized log-likelihood. Penalties such as the lasso are effective in high dimensions, but often lead to unwanted shrinkage. This paper explores instead penalizing the squared distance to constraint sets. Distance penalties are more flexible than algebraic and regularization penalties, and avoid the drawback of shrinkage. To optimize distance-penalized objectives, we make use of the majorization-minimization principle. The resulting algorithms are amenable to acceleration and come with global convergence guarantees. Applications to shape constraints, sparse regression, and rank-restricted matrix regression on synthetic and real data showcase strong empirical performance, even under non-convex constraints.
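For concreteness, here is a minimal sketch of the majorization-minimization (MM) idea behind distance-to-set penalties in the simplest Gaussian (least-squares) case with a hypothetical nonnegativity constraint set C = {b : b >= 0}. The penalty weight rho, the helper names, and the toy data are illustrative assumptions, not the paper's implementation; the key step is majorizing dist(b, C)^2 by ||b - P_C(b_k)||^2 at the current iterate, which gives a closed-form ridge-like update.

```python
import numpy as np

def project_nonneg(beta):
    # Euclidean projection onto the assumed constraint set C = {b : b >= 0}.
    return np.maximum(beta, 0.0)

def mm_distance_penalized_ls(X, y, rho=10.0, n_iter=200):
    # Minimize 0.5*||y - X b||^2 + (rho/2) * dist(b, C)^2 by MM:
    # at iterate b_k, dist(b, C)^2 <= ||b - P_C(b_k)||^2 with equality at b_k,
    # so each MM step solves (X'X + rho*I) b = X'y + rho * P_C(b_k).
    n, p = X.shape
    beta = np.zeros(p)
    A = X.T @ X + rho * np.eye(p)  # fixed system matrix reused every iteration
    for _ in range(n_iter):
        beta = np.linalg.solve(A, X.T @ y + rho * project_nonneg(beta))
    return beta

# Toy usage: recover a nonnegative coefficient vector from noisy observations.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
beta_true = np.array([1.0, 0.0, 2.0, 0.0, 0.5])
y = X @ beta_true + 0.1 * rng.normal(size=100)
print(mm_distance_penalized_ls(X, y))
```

The same template extends to other GLM losses and constraint sets whenever a projection onto C is available, which is what makes the distance penalty attractive even for non-convex sets.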
Year
2017
Venue
ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 30 (NIPS 2017)
Field
Convergence (routing), Mathematical optimization, Square (algebra), Regression, Matrix (mathematics), Lasso (statistics), Generalized linear model, Regularization (mathematics), Acceleration, Mathematics
DocType
Conference
Volume
30
ISSN
1049-5258
Citations
1
PageRank
0.35
References
6
Authors
3
Name           Order  Citations  PageRank
Xu, Jason      1      9          1.63
Eric C. Chi    2      1          1.03
Kenneth Lange  3      3104       3.08