TY - JOUR
T1 - Intrinsic metric learning with subspace representation
AU - Cai, Lipeng
AU - Ying, Shihui
AU - Peng, Yaxin
AU - He, Changzhou
AU - Du, Shaoyi
N1 - Publisher Copyright:
© 2013 IEEE.
PY - 2019
Y1 - 2019
N2 - The accuracy of classification and retrieval depends significantly on the metric used to compute the similarity between samples. To preserve the geometric structure, the symmetric positive definite (SPD) manifold has been introduced into the metric learning problem. However, the SPD constraint is too strict to describe real data distributions. In this paper, we extend the intrinsic metric learning problem to the semi-definite case, which better describes the data distribution for various classification tasks. First, we formulate metric learning as a minimization problem on the SPD manifold of a subspace, which not only balances the intra-class and inter-class information through an adaptive tradeoff parameter but also improves robustness through a low-rank subspace representation. This makes it possible to design a structure-preserving algorithm on the subspace using the geodesic structure of the SPD subspace. To solve this model, we develop an iterative strategy that alternately updates the intrinsic metric and the subspace structure. Finally, we compare the proposed method with ten state-of-the-art methods on four data sets. The numerical results validate that our method significantly improves the description of the data distribution and, hence, the performance of the image classification task.
AB - The accuracy of classification and retrieval depends significantly on the metric used to compute the similarity between samples. To preserve the geometric structure, the symmetric positive definite (SPD) manifold has been introduced into the metric learning problem. However, the SPD constraint is too strict to describe real data distributions. In this paper, we extend the intrinsic metric learning problem to the semi-definite case, which better describes the data distribution for various classification tasks. First, we formulate metric learning as a minimization problem on the SPD manifold of a subspace, which not only balances the intra-class and inter-class information through an adaptive tradeoff parameter but also improves robustness through a low-rank subspace representation. This makes it possible to design a structure-preserving algorithm on the subspace using the geodesic structure of the SPD subspace. To solve this model, we develop an iterative strategy that alternately updates the intrinsic metric and the subspace structure. Finally, we compare the proposed method with ten state-of-the-art methods on four data sets. The numerical results validate that our method significantly improves the description of the data distribution and, hence, the performance of the image classification task.
KW - Metric learning
KW - image classification
KW - low-rank optimization
KW - structure preserving
KW - subspace representation
UR - https://www.scopus.com/pages/publications/85067280647
U2 - 10.1109/ACCESS.2019.2918149
DO - 10.1109/ACCESS.2019.2918149
M3 - Article
AN - SCOPUS:85067280647
SN - 2169-3536
VL - 7
SP - 68572
EP - 68583
JO - IEEE Access
JF - IEEE Access
M1 - 8719971
ER -