Relative gradient speeding up additive updates for nonnegative matrix factorization
Abstract
There are two kinds of iterative updates for nonnegative matrix factorization: additive and multiplicative. The former does not take into account the structure of the constrained parameter space, while the latter preserves nonnegativity well. The relative gradient has a better convergence rate than the ordinary gradient and has been used successfully in neural learning, especially for blind source separation and independent component analysis. This paper applies the relative gradient to speed up the additive updates for nonnegative matrix factorization under the squared Euclidean error. Preliminary experiments on synthetic and real datasets demonstrate the effectiveness of the proposed method.
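The abstract contrasts additive and multiplicative NMF updates. As a rough illustration, the sketch below implements the standard Lee–Seung multiplicative updates for the squared Euclidean error, plus an additive update whose step is scaled by the parameter itself in a relative-gradient style. The second function is only one plausible reading of such a scheme, not necessarily the update rule proposed in the paper; the function names and the step size `eta` are assumptions for this sketch.

```python
import numpy as np

def nmf_multiplicative(V, r, n_iter=200, seed=0, eps=1e-10):
    """Lee-Seung multiplicative updates for min ||V - W H||_F^2.

    Each factor is multiplied elementwise by a nonnegative ratio,
    so nonnegativity is preserved automatically.
    """
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, r)) + 0.1
    H = rng.random((r, n)) + 0.1
    for _ in range(n_iter):
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

def nmf_relative_additive(V, r, n_iter=200, eta=0.5, seed=0, eps=1e-10):
    """Illustrative additive update with a relative-gradient-style scaling.

    The raw gradient is rescaled by the current parameter value (divided by
    the positive part of the gradient), so the step adapts to the geometry
    of the nonnegative orthant. NOT necessarily the paper's exact rule.
    """
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, r)) + 0.1
    H = rng.random((r, n)) + 0.1
    for _ in range(n_iter):
        grad_H = W.T @ (W @ H - V)                      # gradient w.r.t. H
        H = np.maximum(H - eta * H * grad_H / (W.T @ W @ H + eps), eps)
        grad_W = (W @ H - V) @ H.T                      # gradient w.r.t. W
        W = np.maximum(W - eta * W * grad_W / (W @ H @ H.T + eps), eps)
    return W, H
```

Note that with `eta = 1` the scaled additive step recovers the multiplicative update exactly, since `H - H * (WᵀWH - WᵀV) / (WᵀWH) = H * (WᵀV) / (WᵀWH)`; smaller `eta` gives a damped additive variant.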
| Original language | English |
|---|---|
| Pages (from-to) | 493-499 |
| Number of pages | 7 |
| Journal | Neurocomputing |
| Volume | 57 |
| Issue number | 1-4 |
| DOIs | |
| State | Published - Mar 2004 |
Keywords
- Additive updates
- Multiplicative updates
- Nonnegative matrix factorization
- Relative gradient