A local boosting algorithm for solving classification problems

Research output: Contribution to journal › Article › peer-review

52 Scopus citations

Abstract

Based on the boosting-by-resampling version of Adaboost, a local boosting algorithm for classification tasks is proposed in this paper. Its main idea is that, in each iteration, a local error is calculated for every training instance, and a function of this local error is used to update the probability that the instance is selected for the next classifier's training set. When classifying a novel instance, the similarity between it and each training instance is taken into account. In addition, a parameter is introduced into the probability-update process so that the algorithm can achieve higher accuracy than Adaboost. Experimental results on synthetic data and several benchmark real-world data sets from the UCI repository show that the proposed method improves both the prediction accuracy and the robustness to classification noise of Adaboost. Furthermore, the diversity-accuracy patterns of the ensemble classifiers are investigated with kappa-error diagrams.
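The abstract leaves the concrete local-error definition, the probability-update rule, and the similarity weighting unspecified. The sketch below is a minimal, illustrative reading of the idea, not the paper's algorithm: it assumes decision stumps as the weak learner, a k-nearest-neighbour local error, an exponential probability update with an extra parameter `beta` (standing in for the abstract's tuning parameter), and prediction that weights each stump's vote by its competence in the query's neighbourhood. All of these choices are assumptions for illustration.

```python
import numpy as np

def train_stump(X, y):
    """Exhaustive search for the best axis-aligned decision stump (labels in {-1, +1})."""
    best = (0, 0.0, 1, np.inf)  # (feature, threshold, polarity, error)
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            for pol in (1, -1):
                pred = np.where(pol * (X[:, j] - thr) >= 0, 1, -1)
                err = np.mean(pred != y)
                if err < best[3]:
                    best = (j, thr, pol, err)
    return best[:3]

def stump_predict(stump, X):
    j, thr, pol = stump
    return np.where(pol * (X[:, j] - thr) >= 0, 1, -1)

def local_boost_fit(X, y, T=10, k=5, beta=2.0, seed=0):
    """Boosting by resampling with a k-NN local error (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    n = len(y)
    p = np.full(n, 1.0 / n)                      # resampling probabilities
    # k nearest training neighbours of each training instance
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    nbrs = np.argsort(D, axis=1)[:, :k]
    stumps = []
    for _ in range(T):
        idx = rng.choice(n, size=n, p=p)         # draw the round's training set
        stump = train_stump(X[idx], y[idx])
        pred = stump_predict(stump, X)
        # local error: misclassification rate among each instance's k neighbours
        local_err = np.mean(pred[nbrs] != y[nbrs], axis=1)
        # upweight instances whose neighbourhood is poorly classified
        # (beta is an assumed stand-in for the paper's tuning parameter)
        p = p * np.exp(beta * local_err)
        p /= p.sum()
        stumps.append((stump, local_err))
    return stumps, X, y

def local_boost_predict(model, Xq, k=5):
    """Weight each stump's vote by its competence near the query instance."""
    stumps, Xtr, _ = model
    D = np.linalg.norm(Xq[:, None, :] - Xtr[None, :, :], axis=2)
    nn = np.argsort(D, axis=1)[:, :k]
    votes = np.zeros(len(Xq))
    for stump, local_err in stumps:
        competence = 1.0 - np.mean(local_err[nn], axis=1)
        votes += competence * stump_predict(stump, Xq)
    return np.where(votes >= 0, 1, -1)

# Tiny synthetic demo: a diagonal boundary that no single stump fits exactly
rng = np.random.default_rng(1)
X = rng.normal(size=(40, 2))
y = np.where(X[:, 0] + X[:, 1] >= 0, 1, -1)
model = local_boost_fit(X, y, T=5, k=5)
acc = np.mean(local_boost_predict(model, X) == y)
```

The competence term in `local_boost_predict` is one plausible way to realise the abstract's "similarity information between [the novel instance] and each training instance": stumps that performed well in the query's neighbourhood get a larger say in the vote.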

Original language: English
Pages (from-to): 1928-1941
Number of pages: 14
Journal: Computational Statistics and Data Analysis
Volume: 52
Issue number: 4
DOIs
State: Published - 10 Jan 2008

Keywords

  • Adaboost
  • Classification noise
  • Kappa-error diagram
  • Local boosting
  • Weak learning algorithm
