A two-stage minimax concave penalty based method in pruned AdaBoost ensemble

He Jiang, Weihua Zheng, Liangqing Luo, Yao Dong

Research output: Contribution to journal › Article › peer-review

18 Scopus citations

Abstract

AdaBoost is a highly effective ensemble learning method that combines several weak learners to produce a strong committee with higher accuracy. However, like other ensemble methods, AdaBoost uses a large number of base learners to produce the final outcome when addressing high-dimensional data, which poses a critical challenge in the form of high memory consumption. Feature selection methods can significantly reduce dimensionality in regression and have been shown to be applicable to ensemble pruning. By pruning the ensemble, it is possible to generate a simpler ensemble with fewer base learners but higher accuracy. In this article, we propose using the minimax concave penalty (MCP) function to prune an AdaBoost ensemble, simplifying the model and improving its accuracy simultaneously. The MCP penalty function is compared with LASSO and SCAD in terms of performance in pruning the ensemble. Experiments on real datasets demonstrate that MCP pruning outperforms the other two methods: it reduces the ensemble size effectively and generates marginally more accurate predictions than the unpruned AdaBoost model.
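The abstract's core idea — applying the MCP to the base-learner weights so that small contributions are pruned to zero while large ones escape LASSO-style shrinkage — can be sketched with the standard MCP penalty and its closed-form univariate thresholding operator. This is a minimal illustration, not the authors' full two-stage procedure; the toy weight vector and parameter values (`lam`, `gamma`) are assumptions for demonstration only.

```python
import numpy as np

def mcp_penalty(t, lam, gamma):
    """Minimax concave penalty: lam*|t| - t^2/(2*gamma) for |t| <= gamma*lam,
    and the constant gamma*lam^2/2 beyond that (so large coefficients
    incur no additional penalty)."""
    a = np.abs(t)
    return np.where(a <= gamma * lam,
                    lam * a - a**2 / (2 * gamma),
                    0.5 * gamma * lam**2)

def mcp_threshold(z, lam, gamma):
    """Closed-form MCP thresholding for a univariate least-squares
    problem (requires gamma > 1): small inputs are soft-thresholded
    and rescaled, large inputs pass through unchanged (no bias)."""
    soft = np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)
    return np.where(np.abs(z) <= gamma * lam, soft / (1.0 - 1.0 / gamma), z)

# Toy pruning example (hypothetical base-learner weights): weights
# below the threshold region are zeroed out, i.e. those base learners
# are dropped from the ensemble; large weights survive unshrunk.
weights = np.array([0.05, 0.4, 1.2, -0.02, 0.9])
pruned = mcp_threshold(weights, lam=0.1, gamma=3.0)
keep = np.flatnonzero(pruned)  # indices of retained base learners
```

With these values, the two near-zero weights are eliminated while the remaining three pass through exactly, illustrating why MCP can shrink the ensemble without biasing the surviving weights the way LASSO does.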

Original language: English
Article number: 105674
Journal: Applied Soft Computing Journal
Volume: 83
DOIs
State: Published - Oct 2019
Externally published: Yes

Keywords

  • AdaBoost
  • Ensemble pruning
  • Feature selection
  • Minimax concave penalty

