Title
Metric learning with rank and sparsity constraints
Abstract
Choosing a distance preserving measure or metric is fundamental to many signal processing algorithms, such as k-means, nearest neighbor searches, hashing, and compressive sensing. In virtually all these applications, the efficiency of the signal processing algorithm depends on how fast we can evaluate the learned metric. Moreover, storing the chosen metric can create space bottlenecks in high dimensional signal processing problems. As a result, we consider data dependent metric learning with rank as well as sparsity constraints. We propose a new non-convex algorithm and empirically demonstrate its performance on various datasets; a side benefit is that it is also much faster than existing approaches. The added sparsity constraints significantly improve the speed of multiplying with the learned metrics without sacrificing their quality.
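The abstract does not spell out the update rule, but the keyword list (proximal gradient methods, Nesterov acceleration, low-rank, sparsity) points to an iterate-and-project scheme on the metric matrix. Below is a minimal, hypothetical sketch of that generic idea rather than the authors' algorithm: a plain gradient step on an assumed pairwise hinge loss, followed by hard low-rank and sparsity projections. The function name, the loss, and the parameters (rank, nnz, step, iters) are illustrative assumptions.

import numpy as np

def learn_metric_sketch(X, labels, rank, nnz, step=1e-3, iters=100):
    # Minimal sketch (not the paper's algorithm): gradient steps on a hypothetical
    # pairwise hinge loss, then hard projections onto low-rank and sparse symmetric
    # matrices. `rank` = target rank, `nnz` = number of nonzero entries kept.
    n, d = X.shape
    M = np.eye(d)
    for _ in range(iters):
        G = np.zeros((d, d))
        for i in range(n):
            for j in range(i + 1, n):
                diff = (X[i] - X[j])[:, None]
                dist = float(diff.T @ M @ diff)   # squared Mahalanobis distance
                if labels[i] == labels[j]:
                    G += diff @ diff.T            # pull same-class points together
                elif dist < 1.0:
                    G -= diff @ diff.T            # push violating pairs apart (unit margin)
        M -= step * G
        # Low-rank projection: keep the top-`rank` nonnegative eigenpairs.
        w, V = np.linalg.eigh(M)
        w = np.clip(w, 0.0, None)
        top = np.argsort(w)[::-1][:rank]
        M = (V[:, top] * w[top]) @ V[:, top].T
        # Sparsity projection: hard-threshold all but the `nnz` largest-magnitude entries.
        cutoff = np.sort(np.abs(M), axis=None)[-nnz]
        M[np.abs(M) < cutoff] = 0.0
    return M

A metric of this low-rank, sparse form can be applied to a vector at a cost proportional to its number of nonzeros, which is the speed and storage benefit the abstract refers to.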
Year
2014
DOI
10.1109/ICASSP.2014.6853550
Venue
ICASSP
Keywords
metric learning,k-means algorithms,data dependent metric learning,compressive sensing,learning (artificial intelligence),hashing algorithms,distance preserving metric,signal processing algorithms,nearest neighbor searches,compressed sensing,concave programming,proximal gradient methods,gradient methods,sparsity,high dimensional signal processing problems,sparsity constraints,nesterov acceleration,low-rank,nonconvex algorithm,rank constraints,acceleration,sparse matrices,measurement,principal component analysis,vectors
Field
k-nearest neighbors algorithm,Signal processing,Mathematical optimization,Pattern recognition,Computer science,Data dependent,Artificial intelligence,Hash function,Compressed sensing,Signal processing algorithms
DocType
Conference
ISSN
1520-6149
Citations
4
PageRank
0.44
References
15
Authors
4
Name                Order  Citations  PageRank
Bubacarr Bah        1      76         6.25
Stephen R. Becker   2      436        27.61
Volkan Cevher       3      1860       141.56
Baran Gozcu         4      12         1.57