
Relative gradient speeding up additive updates for nonnegative matrix factorization

Research output: Contribution to journal › Article › peer-review

7 Citations (Scopus)

Abstract

There are two kinds of iterative updates for nonnegative matrix factorization: additive and multiplicative. The former does not take into account the structure of the constrained parameter space, while the latter preserves nonnegativity well. The relative gradient has a better convergence rate than the ordinary gradient and has been used successfully for neural learning, especially for blind source separation and independent component analysis. This paper applies the relative gradient to speed up the additive updates for nonnegative matrix factorization under the squared Euclidean error. Preliminary experiments on synthetic and real datasets demonstrate the effectiveness of the proposed method.
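The following is a minimal sketch of one plausible reading of this idea, assuming the relative gradient is realized as an elementwise rescaling of the ordinary gradient by the current (nonnegative) factor values; the paper's exact update rule, step-size schedule, and safeguards may differ. The function name `nmf_relative_gradient` and its parameters are illustrative, not taken from the paper.

```python
import numpy as np

def nmf_relative_gradient(V, rank, n_iter=200, eta=1e-3, eps=1e-12, seed=0):
    """Sketch of relative-gradient additive updates for NMF under the
    squared Euclidean error ||V - W H||_F^2 (illustrative assumption,
    not the paper's exact algorithm)."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, rank)) + eps
    H = rng.random((rank, n)) + eps

    for _ in range(n_iter):
        # Ordinary gradients of 0.5 * ||V - W H||_F^2
        grad_W = (W @ H - V) @ H.T
        grad_H = W.T @ (W @ H - V)

        # Relative-gradient additive step: scale the ordinary gradient
        # elementwise by the current parameter values, so the step adapts
        # to parameter magnitude and small steps stay in the nonnegative
        # orthant.
        W = W - eta * W * grad_W
        H = H - eta * H * grad_H

        # Safeguard against tiny negative values from an overly large step.
        np.maximum(W, eps, out=W)
        np.maximum(H, eps, out=H)

    return W, H

# Usage example on a random nonnegative matrix.
V = np.abs(np.random.default_rng(1).random((50, 40)))
W, H = nmf_relative_gradient(V, rank=5)
print(np.linalg.norm(V - W @ H))
```

With an elementwise step size chosen as H / (WᵀWH), this parameter-scaled step reduces to the classical multiplicative update, which is one way the relative-gradient view connects the additive and multiplicative families.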

Original language: English
Pages (from-to): 493-499
Number of pages: 7
Journal: Neurocomputing
Volume: 57
Issue number: 1-4
Publication status: Published - Mar 2004
