Title
Isotonic Modeling with Non-differentiable Loss Functions with Application to Lasso Regularization
Abstract
In this paper we present a novel algorithmic approach for fitting isotonic models under convex, yet non-differentiable, loss functions. It generalizes the greedy non-regret approach proposed by Luss and Rosset (2012) for differentiable loss functions, incorporating the required subgradient-based extensions. We prove that our suggested algorithm solves the isotonic modeling problem while maintaining favorable computational and statistical properties. Since the algorithm applies to any convex non-differentiable loss function, we focus on isotonic modeling for either regression or two-class classification with the appropriate log-likelihood loss and a lasso penalty on the fitted values. This combination allows us to maintain the non-parametric nature of isotonic modeling while controlling model complexity through regularization. We demonstrate the efficiency and usefulness of our approach on both synthetic and real-world data.
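The setting described in the abstract can be summarized by the following regularized isotonic problem; this is a minimal sketch assuming observations $y_1,\dots,y_n$, fitted values $f_1,\dots,f_n$, a convex (possibly non-differentiable) loss $L$, a lasso parameter $\lambda \ge 0$, and a partial order $\preceq$ on the covariates, with symbol names chosen here for illustration rather than taken from the paper:

\begin{equation*}
\min_{f \in \mathbb{R}^n} \; \sum_{i=1}^{n} L(y_i, f_i) \;+\; \lambda \sum_{i=1}^{n} \lvert f_i \rvert
\quad \text{subject to} \quad f_i \le f_j \ \ \text{whenever } x_i \preceq x_j .
\end{equation*}

The lasso penalty (and, for instance, an absolute-error loss) is convex but non-differentiable at zero, which is the situation in which the subgradient-based extensions mentioned in the abstract become necessary.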
Year
2016
DOI
10.1109/TPAMI.2015.2441063
Venue
IEEE Transactions on Pattern Analysis and Machine Intelligence
Keywords
girp, convex optimization, isotonic regression, nonparametric regression, regularization path
Field
Data modeling, Mathematical optimization, Algorithm design, Computer science, Nonparametric regression, Lasso (statistics), Isotonic regression, Regularization (mathematics), Differentiable function, Convex optimization
DocType
Journal
Volume
PP
Issue
99
ISSN
0162-8828
Citations
1
PageRank
0.35
References
7
Authors
2
Name | Order | Citations | PageRank
Amichai Painsky | 1 | 18 | 8.11
Saharon Rosset | 2 | 1087 | 105.33