Robust subspace clustering via penalized mixture of Gaussians

Research output: Contribution to journal › Article › peer-review

25 Scopus citations

Abstract

Many problems in computer vision and pattern recognition can be posed as learning low-dimensional subspace structures from high-dimensional data, and subspace clustering is a commonly used subspace learning strategy. Existing subspace clustering models mainly adopt a deterministic loss function that describes a single noise type between an observed data matrix and its self-expressed form. However, the noise embedded in practical high-dimensional data is generally non-Gaussian and has a much more complex structure. To address this issue, this paper proposes a robust subspace clustering model that embeds the Mixture of Gaussians (MoG) noise modeling strategy into the low-rank representation (LRR) subspace clustering model. Owing to the universal approximation capability of MoG, the proposed model adapts to a wider range of noise distributions than current methods. Additionally, a penalized likelihood method is encoded into the model to select the number of mixture components automatically, yielding the PMoG-LRR model. A modified Expectation Maximization (EM) algorithm is designed to infer the parameters of PMoG-LRR. The superiority of our method is demonstrated by extensive experiments on face clustering and motion segmentation datasets.
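To make the noise-modeling idea concrete, below is a minimal Python sketch of the penalized-MoG fitting step alone, assuming zero-mean Gaussian components on the self-expression residual E = X - XZ. The function name fit_penalized_mog, the weight-penalty form (a surrogate controlled by gamma), and all parameter choices are hypothetical illustrations, not the paper's exact penalized likelihood or its modified EM update.

import numpy as np

def fit_penalized_mog(residuals, k_init=5, gamma=1e-3, n_iter=100, tol=1e-6):
    """EM for a zero-mean MoG on flattened residuals; a penalty on the
    mixing weights drives redundant components to zero, so the number of
    components is selected automatically by pruning."""
    e = residuals.ravel()
    n = e.size
    rng = np.random.default_rng(0)
    pi = np.full(k_init, 1.0 / k_init)                # mixing weights
    var = rng.uniform(0.5, 2.0, k_init) * e.var()     # component variances
    for _ in range(n_iter):
        # E-step: responsibility of each component for each residual
        log_p = (-0.5 * e[:, None] ** 2 / var
                 - 0.5 * np.log(2 * np.pi * var) + np.log(pi))
        log_p -= log_p.max(axis=1, keepdims=True)
        r = np.exp(log_p)
        r /= r.sum(axis=1, keepdims=True)
        nk = r.sum(axis=0)
        # M-step with an illustrative penalty on the mixing weights
        # (a stand-in for the paper's penalized likelihood); components
        # whose penalized weight hits zero are pruned.
        pi_new = np.maximum(nk - gamma * n, 0.0)
        keep = pi_new > 0
        pi_new = pi_new[keep] / pi_new[keep].sum()
        var_new = (r[:, keep] * e[:, None] ** 2).sum(axis=0) / np.maximum(nk[keep], 1e-12)
        if pi_new.size == pi.size and np.abs(pi_new - pi).max() < tol:
            pi, var = pi_new, var_new
            break
        pi, var = pi_new, var_new
    return pi, var

# Usage on synthetic non-Gaussian noise (a Gaussian bulk plus heavy outliers):
e = np.concatenate([np.random.normal(0, 0.1, 9000),
                    np.random.normal(0, 2.0, 1000)])
pi, var = fit_penalized_mog(e.reshape(100, 100), k_init=6)
print("components kept:", pi.size, "weights:", np.round(pi, 3))

The pruning rule is what realizes the "automatic selection of the number of mixture components" described in the abstract: starting from a deliberately large k_init, components whose penalized weight collapses to zero are discarded during EM. In the full PMoG-LRR model this noise-fitting step would alternate with updating the low-rank representation Z.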

Original language: English
Pages (from-to): 4-11
Number of pages: 8
Journal: Neurocomputing
Volume: 278
DOIs
State: Published - 22 Feb 2018

Keywords

  • Expectation maximization
  • Low-rank representation
  • Mixture of Gaussians
  • Subspace clustering
