TY - JOUR
T1 - Adaptive filtering under minimum information divergence criterion
AU - Chen, Badong
AU - Zhu, Yu
AU - Hu, Jinchun
AU - Sun, Zengqi
PY - 2009/4
Y1 - 2009/4
N2 - Traditional filtering theory is typically based on optimizing the expected value of a suitably chosen function of the error, such as the minimum mean-square error (MMSE) criterion or the minimum error entropy (MEE) criterion. None of these criteria captures all the probabilistic information about the error distribution. In this work, we propose a novel approach to shaping the probability density function (PDF) of the errors in adaptive filtering. Because the PDF contains all the probabilistic information, the proposed approach can be used to obtain a desired variance or entropy, and is expected to be useful in complex signal processing and learning systems. In our method, the information divergence between the actual errors and the desired errors is chosen as the cost function and is estimated by a kernel approach. Some important properties of the estimated divergence are presented. Also, for the finite impulse response (FIR) filter, a stochastic gradient algorithm is derived. Finally, simulation examples illustrate the effectiveness of this algorithm in adaptive system training.
KW - Adaptive filtering
KW - Information divergence
KW - Kernel method
KW - Stochastic gradient algorithm
UR - https://www.scopus.com/pages/publications/64849100387
U2 - 10.1007/s12555-009-0201-0
DO - 10.1007/s12555-009-0201-0
M3 - Article
AN - SCOPUS:64849100387
SN - 1598-6446
VL - 7
SP - 157
EP - 164
JO - International Journal of Control, Automation and Systems
JF - International Journal of Control, Automation and Systems
IS - 2
ER -