Title
Convex Perturbations for Scalable Semidefinite Programming
Abstract
Many important machine learning problems are modeled and solved via semidefinite programs; examples include metric learning, nonlinear embedding, and certain clustering problems. Often, off-the-shelf software is invoked for the associated optimization, which can be inappropriate due to excessive computational and storage requirements. In this paper, we introduce the use of convex perturbations for solving semidefinite programs (SDPs), and for a specific perturbation we derive an algorithm that has several advantages over existing techniques: a) it is simple, requiring only a few lines of MATLAB, b) it is a first-order method, and thereby scalable, and c) it can easily exploit the structure of a given SDP (e.g., when the constraint matrices are low-rank, a situation common to several machine learning SDPs). A pleasant byproduct of our method is a fast, kernelized version of the large-margin nearest neighbor metric learning algorithm (Weinberger et al., 2005). We demonstrate that our algorithm is effective in finding fast approximations to large-scale SDPs arising in some machine learning applications.
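The abstract only describes the approach at a high level, so the following is a minimal illustrative sketch (in Python/NumPy rather than the authors' MATLAB) of the general idea of solving a perturbed SDP with a first-order dual method. The squared-Frobenius perturbation, the function names, and the fixed step size are assumptions made for illustration and are not taken from the paper; the sketch only shows how a strongly convex perturbation turns the inner problem into a closed-form PSD projection, so that plain gradient steps on the dual suffice.

import numpy as np

def psd_project(S):
    # Project a symmetric matrix onto the PSD cone by zeroing negative eigenvalues.
    w, V = np.linalg.eigh((S + S.T) / 2)
    return (V * np.maximum(w, 0.0)) @ V.T

def perturbed_sdp(C, A_list, b, mu=1.0, step=1e-2, iters=2000):
    # Illustrative (assumed) perturbation, not the paper's specific one: solve
    #   max  tr(C X) - (mu/2) ||X||_F^2   s.t.  tr(A_i X) = b_i,  X PSD,
    # by gradient descent on the dual. For a fixed dual vector y, the inner
    # maximizer has the closed form X(y) = (1/mu) * Pi_PSD(C - sum_i y_i A_i),
    # so each iteration costs one eigendecomposition plus trace evaluations.
    b = np.asarray(b, dtype=float)
    y = np.zeros(len(A_list))
    X = psd_project(C) / mu
    for _ in range(iters):
        S = C - sum(yi * Ai for yi, Ai in zip(y, A_list))
        X = psd_project(S) / mu                                    # primal iterate
        grad = b - np.array([np.trace(Ai @ X) for Ai in A_list])   # dual gradient
        y = y - step * grad                                        # dual descent step
    return X, y

When the constraint matrices A_i are low-rank, the traces tr(A_i X) can be formed without dense matrix products, which is the kind of structure the abstract says the method can exploit.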
Year
2009
Venue
AISTATS
Keywords
first order, machine learning
Field
Active learning (machine learning), Computer science, Wake-sleep algorithm, Theoretical computer science, Artificial intelligence, Large margin nearest neighbor, Cluster analysis, Online machine learning, Mathematical optimization, Stability (learning theory), Semidefinite embedding, Semidefinite programming, Machine learning
DocType
Journal
Volume
5
Citations
7
PageRank
0.59
References
19
Authors
4
Name                  Order  Citations  PageRank
Brian Kulis           1      4700       201.68
suvrit sra            2      14         1.20
Inderjit S. Dhillon   3      7601       450.15
d van dyk m welling   4      11         1.32