
Kernel gradient descent algorithm for information theoretic learning

Research output: Contribution to journal › Article › peer-review

4 Citations (Scopus)

Abstract

Information theoretic learning is a learning paradigm that uses entropy and divergence concepts from information theory. A variety of signal processing and machine learning methods fall into this framework, and the minimum error entropy principle is a typical example. In this paper, we study a kernel version of minimum error entropy methods that can be used to find nonlinear structures in data. We show that kernel minimum error entropy can be implemented by kernel-based gradient descent algorithms with or without regularization. Convergence rates are derived for both algorithms.
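The abstract does not give the algorithm itself, but the idea it describes can be sketched as follows. A minimum error entropy method maximizes the empirical information potential of the residuals, V(e) = (1/n²) Σᵢⱼ G_σ(eᵢ − eⱼ) with a Gaussian (Parzen) kernel G_σ, over hypotheses f(x) = Σₖ αₖ K(xₖ, x) in an RKHS. Below is a minimal, hypothetical Python illustration of such a kernel gradient descent, with an optional Tikhonov penalty λ αᵀKα; all parameter names and values are assumptions for illustration, not the paper's actual scheme.

```python
import numpy as np

def kernel_mee_gd(X, y, sigma_k=0.3, sigma_e=0.5, lam=0.0,
                  lr=0.02, n_iter=500):
    """Sketch of kernel gradient descent for minimum error entropy.

    Ascends the empirical information potential
        V(alpha) = (1/n^2) sum_{i,j} G_{sigma_e}(e_i - e_j),
    where e = y - K alpha, optionally penalized by lam * alpha^T K alpha.
    Inputs X are 1-D here purely to keep the sketch short.
    """
    n = len(y)
    # Gram matrix of the Gaussian RKHS kernel on the inputs
    K = np.exp(-(X[:, None] - X[None, :])**2 / (2 * sigma_k**2))
    alpha = np.zeros(n)
    for _ in range(n_iter):
        e = y - K @ alpha                        # residuals
        E = e[:, None] - e[None, :]              # pairwise error differences
        G = np.exp(-E**2 / (2 * sigma_e**2))     # Parzen kernel on errors
        # dV/dalpha = (2 / (n^2 sigma_e^2)) K r, with r the row sums
        # of the antisymmetric matrix G * E
        r = (G * E).sum(axis=1)
        grad_V = 2.0 / (n**2 * sigma_e**2) * (K @ r)
        # gradient ascent on V minus the regularization penalty
        alpha += lr * (grad_V - 2.0 * lam * (K @ alpha))
    return alpha, K
```

Note that the information potential is shift-invariant: maximizing it concentrates the residuals around a common value rather than at zero, so in practice a bias-correction step usually follows.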

Original language: English
Article number: 105518
Journal: Journal of Approximation Theory
Volume: 263
DOI
Publication status: Published - March 2021
Externally published: Yes
