Relative gradient speeding up additive updates for nonnegative matrix factorization

Research output: Contribution to journal › Article › peer-review

Abstract

Two kinds of iterative updates exist for nonnegative matrix factorization: additive and multiplicative. The former does not take the geometry of the constrained parameter space into account, while the latter preserves nonnegativity well. The relative gradient has a better convergence rate than the ordinary gradient and has been used successfully in neural learning, especially for blind source separation and independent component analysis. This paper applies the relative gradient to speed up the additive updates for nonnegative matrix factorization under the squared Euclidean error. Preliminary experiments on synthetic and real datasets demonstrate the effectiveness of the proposed method.
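As background, the additive update contrasted in the abstract can be sketched in code. The sketch below is an illustrative assumption, not the paper's exact formulation: it scales the ordinary gradient of the squared Euclidean error elementwise by the current factor (a relative-gradient-style step), normalized so that a step size of 1 recovers the well-known multiplicative update, while smaller steps give a damped additive rule that still preserves nonnegativity.

```python
import numpy as np

def relative_gradient_nmf(V, W, H, n_iter=200, eta=0.7, eps=1e-9):
    """Hypothetical relative-gradient additive NMF updates for
    0.5 * ||V - W H||_F^2 (an illustrative sketch, not the paper's rule).

    The ordinary gradient is scaled elementwise by the current factor and
    normalized by the denominator of the multiplicative update, so eta = 1
    reproduces the classic multiplicative rule and eta < 1 is a damped
    additive step that keeps the factors nonnegative.
    """
    for _ in range(n_iter):
        # Ordinary gradient w.r.t. H: W^T (W H - V)
        grad_H = W.T @ (W @ H - V)
        H = np.maximum(H - eta * H * grad_H / (W.T @ W @ H + eps), eps)
        # Ordinary gradient w.r.t. W: (W H - V) H^T
        grad_W = (W @ H - V) @ H.T
        W = np.maximum(W - eta * W * grad_W / (W @ H @ H.T + eps), eps)
    return W, H

rng = np.random.default_rng(0)
V = rng.random((20, 15))                # nonnegative data matrix
W0 = rng.random((20, 4)) + 0.1         # nonnegative initial factors
H0 = rng.random((4, 15)) + 0.1
err_before = np.linalg.norm(V - W0 @ H0)
W, H = relative_gradient_nmf(V, W0.copy(), H0.copy())
err_after = np.linalg.norm(V - W @ H)
```

Because the normalized relative-gradient step shrinks each entry by at most a factor of `eta`, nonnegativity is maintained without an explicit projection beyond the numerical floor `eps`.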

Original language: English
Pages (from-to): 493-499
Number of pages: 7
Journal: Neurocomputing
Volume: 57
Issue number: 1-4
DOIs
State: Published - Mar 2004

Keywords

  • Additive updates
  • Multiplicative updates
  • Nonnegative matrix factorization
  • Relative gradient
