An empirical bias-variance analysis of DECORATE ensemble method at different training sample sizes

Research output: Contribution to journal › Article › peer-review


Abstract

DECORATE (Diverse Ensemble Creation by Oppositional Relabeling of Artificial Training Examples) is a classifier combination technique that constructs a set of diverse base classifiers using additional, artificially generated training instances. The predictions of the base classifiers are then combined by the mean combination rule. To gain more insight into its effectiveness and advantages, this paper uses a large-scale experiment to study the bias-variance decomposition of the error of DECORATE, as well as that of several other widely used ensemble methods (bagging, AdaBoost, and random forest), at different training sample sizes. The experimental results support the following conclusions. For small training sets, DECORATE holds a clear advantage over its rivals, and its success is attributable to its achieving a larger bias reduction than the other algorithms. As the amount of training data grows, AdaBoost benefits most: its bias reduction gradually becomes significant while its variance reduction remains moderate, so AdaBoost performs best on large training samples. Random forest consistently ranks second regardless of training-set size; it mainly decreases variance while maintaining low bias. Bagging occupies an intermediate position, since it primarily reduces variance.
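The kind of empirical bias-variance estimation described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' exact protocol: it repeatedly trains a classifier on random subsamples of a fixed size, takes the majority vote across training runs as the "main prediction", and reads bias off the main prediction's error and variance off the disagreement with the main prediction (one common decomposition of 0-1 loss; the paper may use a different one). The dataset, subsample size, and number of rounds are arbitrary choices for the sketch.

```python
# Sketch: empirical bias-variance estimation for classifiers under 0-1 loss,
# over repeated random training samples of a fixed size. Assumes scikit-learn.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier

rng = np.random.RandomState(0)

# Synthetic data: a pool to draw training sets from, plus a held-out test set.
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_pool, y_pool = X[:1500], y[:1500]
X_test, y_test = X[1500:], y[1500:]

def bias_variance(model_factory, train_size=200, n_rounds=50):
    """Estimate bias and variance of 0-1 loss over random training sets."""
    preds = np.empty((n_rounds, len(y_test)), dtype=int)
    for r in range(n_rounds):
        idx = rng.choice(len(y_pool), size=train_size, replace=False)
        model = model_factory().fit(X_pool[idx], y_pool[idx])
        preds[r] = model.predict(X_test)
    # Main prediction = majority vote over the n_rounds training runs.
    main = np.apply_along_axis(lambda p: np.bincount(p).argmax(), 0, preds)
    bias = np.mean(main != y_test)              # systematic error of the main prediction
    variance = np.mean(preds != main[None, :])  # average disagreement with the main prediction
    return bias, variance

for name, factory in [
    ("single tree", lambda: DecisionTreeClassifier(random_state=0)),
    ("random forest", lambda: RandomForestClassifier(n_estimators=50, random_state=0)),
]:
    b, v = bias_variance(factory)
    print(f"{name}: bias={b:.3f}, variance={v:.3f}")
```

Varying `train_size` in the loop above mimics the paper's central manipulation, letting one observe how each method's bias and variance components change as the training sample grows.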

Original language: English
Pages (from-to): 829-850
Number of pages: 22
Journal: Journal of Applied Statistics
Volume: 39
Issue number: 4
DOIs
State: Published - Apr 2012

Keywords

  • AdaBoost
  • bias-variance decomposition
  • classifier combination method
  • random forest
  • training sample size
