Title
A quasi-Newton approach to non-smooth convex optimization
Abstract
We extend the well-known BFGS quasi-Newton method and its limited-memory variant LBFGS to the optimization of non-smooth convex objectives. This is done in a rigorous fashion by generalizing three components of BFGS to subdifferentials: the local quadratic model, the identification of a descent direction, and the Wolfe line search conditions. We apply the resulting subLBFGS algorithm to L2-regularized risk minimization with binary hinge loss, and its direction-finding component to L1-regularized risk minimization with logistic loss. In both settings our generic algorithms perform comparably to, or better than, their counterparts in specialized state-of-the-art solvers.
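To make the two generalized ingredients concrete, here is a minimal sketch (assumed function names; this is plain LBFGS machinery applied to a subgradient, not the authors' full subLBFGS with its exact line search): a subgradient of the L2-regularized binary hinge-loss risk, with the zero element chosen at non-differentiable points, fed into the standard two-loop recursion to produce a quasi-Newton direction.

```python
import numpy as np

def hinge_risk_subgradient(w, X, y, lam):
    """Subgradient of lam/2 * ||w||^2 + mean(max(0, 1 - y * (X @ w))).

    The hinge loss is non-differentiable where the margin equals 1;
    there we pick the zero element of the subdifferential.
    """
    margins = y * (X @ w)
    active = margins < 1.0                      # points with positive hinge loss
    g_loss = -(X[active].T @ y[active]) / len(y)
    return lam * w + g_loss

def two_loop_direction(g, s_list, y_list):
    """Standard LBFGS two-loop recursion applied to a (sub)gradient g.

    s_list holds past parameter displacements, y_list the corresponding
    (sub)gradient differences; returns the direction -H @ g.
    """
    q = g.copy()
    alphas = []
    for s, yv in zip(reversed(s_list), reversed(y_list)):
        rho = 1.0 / (yv @ s)
        a = rho * (s @ q)
        alphas.append(a)
        q -= a * yv
    if s_list:                                  # initial Hessian scaling
        s, yv = s_list[-1], y_list[-1]
        q *= (s @ yv) / (yv @ yv)
    for (s, yv), a in zip(zip(s_list, y_list), reversed(alphas)):
        rho = 1.0 / (yv @ s)
        b = rho * (yv @ q)
        q += (a - b) * s
    return -q
```

With an empty memory the recursion reduces to steepest descent on the chosen subgradient; with curvature pairs satisfying s·y > 0 the implicit inverse-Hessian approximation stays positive definite, so the returned direction remains a descent direction.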
Year: 2008
DOI: 10.1145/1390156.1390309
Venue: ICML
Keywords: descent direction, local quadratic model, quasi-Newton approach, convex optimization, L1-regularized risk minimization, direction-finding component, logistic loss, BFGS quasi-Newton method, L2-regularized risk minimization, limited-memory variant LBFGS, Wolfe line search conditions, generic algorithm, line search
Field: Mathematical optimization, Hinge loss, Computer science, Descent direction, Line search, Broyden–Fletcher–Goldfarb–Shanno algorithm, Convex optimization, Wolfe conditions
DocType: Conference
Citations: 14
PageRank: 1.29
References: 10
Authors: 4

Name                   | Order | Citations | PageRank
Jin Yu                 | 1     | 14        | 1.29
S. V. N. Vishwanathan  | 2     | 1991      | 131.90
Simon Günter           | 3     | 588       | 34.93
Nicol N. Schraudolph   | 4     | 1185      | 164.26