Quantized minimum error entropy with fiducial points for robust regression

Research output: Contribution to journal › Article › peer-review


Abstract

Minimum error entropy with fiducial points (MEEF) has received considerable attention owing to its outstanding ability to curb the negative influence of non-Gaussian noise in machine learning and signal processing. However, estimating the information potential of MEEF involves a double summation over all available error samples, which can impose a large computational burden in many practical scenarios. In this paper, an efficient quantization method is therefore adopted to represent the primary set of error samples with a smaller subset, yielding a quantized MEEF (QMEEF). Some basic properties of QMEEF are presented and proved from a theoretical perspective. In addition, we apply this new criterion to train a class of linear-in-parameters models, including the commonly used linear regression model, the random vector functional link network, and the broad learning system as special cases. Experimental results on various datasets demonstrate the desirable performance of the proposed methods on regression tasks with contaminated data.
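The quantization idea described above can be sketched in a few lines. The following is a minimal illustration, not the paper's exact formulation: it assumes a Gaussian kernel, a simple online distance-threshold quantizer, and a convex combination (weight `lam`) of a fiducial term (kernels evaluated at the error origin) and the information potential estimated on the quantized codebook; the names `lam`, `eps`, and the weighting scheme are assumptions for illustration.

```python
import numpy as np

def gaussian_kernel(x, sigma):
    # Gaussian kernel evaluated elementwise.
    return np.exp(-x**2 / (2 * sigma**2))

def quantize(errors, eps):
    # Online quantization: each error sample is merged into the nearest
    # code word if it lies within eps; otherwise it starts a new code word.
    # Returns the codebook and how many samples each code word absorbed.
    codebook, counts = [], []
    for e in errors:
        if codebook:
            d = np.abs(np.array(codebook) - e)
            j = int(np.argmin(d))
            if d[j] <= eps:
                counts[j] += 1
                continue
        codebook.append(e)
        counts.append(1)
    return np.array(codebook), np.array(counts)

def qmeef(errors, sigma=1.0, lam=0.5, eps=0.1):
    # Quantized MEEF-style objective (to be maximized).
    # The O(N^2) double sum of the information potential is replaced by an
    # O(N*M) sum over the M code words, weighted by their sample counts.
    N = len(errors)
    c, w = quantize(errors, eps)
    fiducial = lam * np.mean(gaussian_kernel(errors, sigma))
    diffs = errors[:, None] - c[None, :]
    ip = (1 - lam) * np.sum(w[None, :] * gaussian_kernel(diffs, sigma)) / (N * N)
    return fiducial + ip
```

With `eps = 0` every distinct error sample becomes its own code word, and the quantized objective coincides with the full double-sum estimate; larger `eps` trades a small approximation error for a smaller codebook and cheaper evaluation.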

Original language: English
Pages (from-to): 405-418
Number of pages: 14
Journal: Neural Networks
Volume: 168
DOIs
State: Published - Nov 2023

Keywords

  • Broad learning system
  • Minimum error entropy with fiducial points
  • Quantized method
  • Random vector functional link network
  • Robust regression

