Title
Intrinsic Metric Learning With Subspace Representation
Abstract
The accuracy of classification and retrieval depends significantly on the metric used to compute the similarity between samples. To preserve geometric structure, the symmetric positive definite (SPD) manifold has been introduced into the metric learning problem. However, the SPD constraint is too strict to describe real data distributions. In this paper, we extend the intrinsic metric learning problem to the semi-definite case, which better describes the data distribution for various classification tasks. First, we formulate metric learning as a minimization problem on the SPD manifold of a subspace, which not only balances intra-class and inter-class information through an adaptive tradeoff parameter but also improves robustness through a low-rank subspace representation. This makes it possible to design a structure-preserving algorithm on the subspace by exploiting the geodesic structure of the SPD manifold. To solve this model, we develop an iterative strategy that alternately updates the intrinsic metric and the subspace structure. Finally, we compare the proposed method with ten state-of-the-art methods on four data sets. The numerical results validate that our method significantly improves the description of the data distribution and, hence, the performance of the image classification task.
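The abstract's two main ingredients, the geodesic structure of the SPD manifold and a low-rank subspace metric, can be illustrated with a minimal sketch. This is an illustrative assumption, not the paper's actual algorithm: `spd_geodesic_distance` computes the standard affine-invariant geodesic distance between SPD matrices, and `low_rank_mahalanobis` evaluates a generic rank-k Mahalanobis metric M = W Wᵀ of the kind a low-rank subspace representation induces.

```python
import numpy as np

def _spd_apply(A, f):
    """Apply a scalar function f to an SPD matrix via eigendecomposition."""
    w, V = np.linalg.eigh(A)
    return V @ np.diag(f(w)) @ V.T

def spd_geodesic_distance(A, B):
    """Affine-invariant geodesic distance on the SPD manifold:
    d(A, B) = || log(A^{-1/2} B A^{-1/2}) ||_F."""
    A_inv_sqrt = _spd_apply(A, lambda w: 1.0 / np.sqrt(w))
    M = A_inv_sqrt @ B @ A_inv_sqrt
    M = 0.5 * (M + M.T)  # symmetrize against round-off
    return np.linalg.norm(_spd_apply(M, np.log), "fro")

def low_rank_mahalanobis(x, y, W):
    """Squared distance under the rank-k metric M = W W^T (W is d-by-k),
    i.e. the Euclidean distance of x - y after projection onto the subspace W."""
    diff = W.T @ (x - y)
    return float(diff @ diff)
```

The affine-invariant distance is unchanged under congruence transformations A → P A Pᵀ, which is the geometric property a structure-preserving algorithm on the SPD manifold exploits; an alternating scheme like the one the abstract describes would update W with the metric fixed and vice versa.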
Year
2019
DOI
10.1109/ACCESS.2019.2918149
Venue
IEEE ACCESS
Keywords
Metric learning, subspace representation, low-rank optimization, structure preserving, image classification
Field
Data set, Subspace topology, Computer science, Algorithm, Intrinsic metric, Robustness (computer science), Linear subspace, Contextual image classification, Manifold, Geodesic, Distributed computing
DocType
Journal
Volume
7
ISSN
2169-3536
Citations
0
PageRank
0.34
References
0
Authors
5
Name          Order  Citations  PageRank
Lipeng Cai    1      1          0.69
Shihui Ying   2      233        23.32
Yaxin Peng    3      73         16.82
Changzhou He  4      0          0.34
Shaoyi Du     5      357        40.68