Intrinsic metric learning with subspace representation

Lipeng Cai, Shihui Ying, Yaxin Peng, Changzhou He, Shaoyi Du

Research output: Contribution to journal › Article › peer-review

1 Scopus citations

Abstract

The accuracy of classification and retrieval depends significantly on the metric used to compute the similarity between samples. To preserve the geometric structure of the data, the symmetric positive definite (SPD) manifold has been introduced into the metric learning problem. However, the SPD constraint is too strict to describe real data distributions. In this paper, we extend the intrinsic metric learning problem to the semi-definite case, which describes the data distribution better for various classification tasks. First, we formulate metric learning as a minimization problem on the SPD manifold over a subspace, which not only balances the intra-class and inter-class information through an adaptive tradeoff parameter but also improves robustness through the low-rank subspace representation. This makes it possible to design a structure-preserving algorithm on the subspace by exploiting the geodesic structure of the SPD subspace. To solve this model, we develop an iterative strategy that alternately updates the intrinsic metric and the subspace structure. Finally, we compare our proposed method with ten state-of-the-art methods on four data sets. The numerical results validate that our method significantly improves the description of the data distribution and, hence, the performance of the image classification task.
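To make the ingredients in the abstract concrete, the following is a minimal, illustrative sketch of low-rank (positive semi-definite) metric learning in NumPy. It is not the paper's algorithm: the parameterization M = U Uᵀ (which guarantees a PSD metric of bounded rank, loosely mirroring the subspace representation), the loss (mean intra-class squared distance minus a tradeoff t times the mean inter-class squared distance), and the adaptive rule for t (resetting it to the current intra/inter ratio each iteration) are all assumptions made for this sketch.

```python
import numpy as np

def learn_lowrank_metric(X, y, rank=2, lr=0.01, iters=200, seed=0):
    """Sketch of low-rank semi-definite metric learning.

    Parameterizes the metric as M = U U^T, so M is positive
    semi-definite with rank at most `rank`. Minimizes
    (mean intra-class squared distance) - t * (mean inter-class
    squared distance) by gradient descent on U, where t is reset
    each iteration to the current intra/inter ratio (an
    illustrative adaptive tradeoff, not the paper's exact rule).
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    U = 0.1 * rng.standard_normal((d, rank))
    same = (y[:, None] == y[None, :]) & ~np.eye(n, dtype=bool)
    diff = y[:, None] != y[None, :]
    Xd = X[:, None, :] - X[None, :, :]           # all pairwise differences
    ratios = []
    for _ in range(iters):
        Z = Xd @ U                               # differences projected into the subspace
        d2 = np.einsum('ijk,ijk->ij', Z, Z)      # pairwise squared Mahalanobis distances
        intra, inter = d2[same].mean(), d2[diff].mean()
        t = intra / (inter + 1e-12)              # adaptive tradeoff (illustrative)
        ratios.append(t)
        # Gradient of (mean intra - t * mean inter) w.r.t. U:
        #   2 * sum_ij w_ij (x_i - x_j)(x_i - x_j)^T U
        W = same / same.sum() - t * (diff / diff.sum())
        G = 2.0 * np.einsum('ij,ijk,ijl->kl', W, Xd, Xd) @ U
        U -= lr * G
    return U @ U.T, ratios                       # PSD metric M and intra/inter history

# Toy usage: two Gaussian classes separated along the first axis.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(size=(20, 3)),
               rng.normal(size=(20, 3)) + np.array([4.0, 0.0, 0.0])])
y = np.array([0] * 20 + [1] * 20)
M, ratios = learn_lowrank_metric(X, y)
```

By construction the returned M is symmetric, positive semi-definite, and of rank at most `rank`, which is the essential relaxation from the strict SPD constraint that the abstract describes.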

Original language: English
Article number: 8719971
Pages (from-to): 68572-68583
Number of pages: 12
Journal: IEEE Access
Volume: 7
DOIs
State: Published - 2019

Keywords

  • Metric learning
  • image classification
  • low-rank optimization
  • structure preserving
  • subspace representation
