Abstract
There exist two kinds of iterative updates for nonnegative matrix factorization: additive and multiplicative. The former does not take into account the geometry of the parameter space of the constrained optimization, while the latter preserves nonnegativity well. The relative gradient has a better convergence rate than the ordinary gradient and has been used successfully for neural learning, especially for blind source separation and independent component analysis. This paper applies the relative gradient to speed up the additive updates for nonnegative matrix factorization under the squared Euclidean error. Preliminary experiments on synthetic and real datasets demonstrate the effectiveness of the proposed method.
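For context, the multiplicative updates the abstract contrasts against are the classic Lee–Seung rules for minimizing the squared Euclidean error ||V − WH||². The sketch below shows that baseline only; the paper's relative-gradient additive rule is not specified in the abstract, so no attempt is made to reproduce it here. The function name and parameters are illustrative choices, not from the paper.

```python
import numpy as np

def nmf_multiplicative(V, r, n_iter=200, seed=0):
    """Baseline Lee-Seung multiplicative updates minimizing ||V - W H||_F^2.

    V : (m, n) nonnegative data matrix
    r : target rank of the factorization
    Returns nonnegative factors W (m, r) and H (r, n).
    """
    rng = np.random.default_rng(seed)
    m, n = V.shape
    # Strictly positive random initialization keeps the ratios well-defined.
    W = rng.random((m, r)) + 1e-4
    H = rng.random((r, n)) + 1e-4
    eps = 1e-12  # guard against division by zero
    for _ in range(n_iter):
        # Elementwise multiplicative updates: nonnegativity is preserved
        # automatically because every factor in the ratio is nonnegative.
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Factor a small synthetic nonnegative matrix.
V = np.abs(np.random.default_rng(1).random((6, 5)))
W, H = nmf_multiplicative(V, r=2)
err = np.linalg.norm(V - W @ H)
```

Because each update multiplies the current factor by a nonnegative ratio, no projection step is needed; the additive (gradient-descent) variant, by contrast, must clip or project to stay feasible, which is the weakness the paper's relative-gradient approach targets.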
| Original language | English |
|---|---|
| Pages (from-to) | 493-499 |
| Number of pages | 7 |
| Journal | Neurocomputing |
| Volume | 57 |
| Issue | 1-4 |
| DOI | |
| Publication status | Published - March 2004 |
Fingerprint
Dive into the research topics of 'Relative gradient speeding up additive updates for nonnegative matrix factorization'. Together they form a unique fingerprint.