Maximum Correntropy Criterion-Based Robust Semisupervised Concept Factorization for Image Representation

  • Nan Zhou
  • Badong Chen
  • Yuanhua Du
  • Tao Jiang
  • Jun Liu
  • Yangyang Xu

Research output: Contribution to journal › Article › peer-review

19 Scopus citations

Abstract

Concept factorization (CF) has shown great advantages for both clustering and data representation and is particularly useful for image representation. Compared with nonnegative matrix factorization (NMF), CF can be applied to data containing negative values. However, the performance of CF and its extensions degrades considerably in the presence of outliers, and CF is an unsupervised method that cannot incorporate label information. In this article, we propose a novel CF method whose model is built on the maximum correntropy criterion (MCC). To capture the local geometric information of the data, our method integrates robust adaptive embedding and CF into a unified framework, and the label information is utilized in the adaptive learning process. Furthermore, an iterative optimization strategy based on the accelerated block coordinate update is proposed, and the convergence of the proposed method is analyzed to ensure that the algorithm converges to a reliable solution. Experimental results on four real-world image data sets show that the new method can almost always filter out the negative effects of outliers and outperforms several state-of-the-art image representation methods.
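For context, the classical unsupervised CF baseline that this line of work extends factorizes the data as X ≈ X U Vᵀ, with the basis constrained to lie in the span of the data itself. The sketch below uses the standard multiplicative-update scheme for the plain squared-error objective; it is not the robust semisupervised MCC algorithm proposed in the article, and the function name, toy data, and iteration count are illustrative assumptions. For simplicity the toy data are kept nonnegative so that the kernel K = XᵀX stays nonnegative, which multiplicative updates require.

```python
import numpy as np

def concept_factorization(X, k, n_iter=200, eps=1e-9, seed=0):
    """Plain concept factorization X ~= X @ U @ V.T via multiplicative updates.

    Minimizes ||X - X U V^T||_F^2 over nonnegative U, V. This is the
    classical unsupervised CF baseline, not the article's MCC-based method.
    """
    rng = np.random.default_rng(seed)
    n = X.shape[1]                 # number of samples (columns of X)
    K = X.T @ X                    # n x n kernel; CF only needs inner products
    U = rng.random((n, k))         # concept (cluster) weights
    V = rng.random((n, k))         # low-dimensional sample representations
    for _ in range(n_iter):
        # multiplicative updates derived from the gradient of the objective
        U *= (K @ V) / (K @ U @ (V.T @ V) + eps)
        V *= (K @ U) / (V @ (U.T @ (K @ U)) + eps)
    return U, V

# toy usage: 20-dim features, 30 samples, 3 concepts
X = np.abs(np.random.default_rng(1).normal(size=(20, 30)))
U, V = concept_factorization(X, k=3)
recon = X @ U @ V.T   # reconstruction in the span of the data
```

The robust variant described in the abstract would, roughly speaking, replace the squared reconstruction error with a correntropy-induced (Gaussian-kernel) loss, which downweights outlying samples instead of letting them dominate the fit.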

Original language: English
Article number: 8894670
Pages (from-to): 3877-3891
Number of pages: 15
Journal: IEEE Transactions on Neural Networks and Learning Systems
Volume: 31
Issue number: 10
DOIs
State: Published - Oct 2020

Keywords

  • Concept factorization (CF)
  • machine learning
  • maximum correntropy criterion (MCC)
  • nonnegative matrix factorization (NMF)
  • semisupervised learning
