SVM-Boosting based on Markov resampling: Theory and algorithm

  • Hongwei Jiang
  • , Bin Zou
  • , Chen Xu
  • , Jie Xu
  • , Yuan Yan Tang

Research output: Contribution to journal › Article › peer-review

24 Scopus citations

Abstract

In this article we introduce the idea of Markov resampling for Boosting methods. We first prove that the Boosting algorithm with a general convex loss function, trained on uniformly ergodic Markov chain (u.e.M.c.) examples, is consistent, and we establish its fast convergence rate. We then apply Boosting based on Markov resampling to the Support Vector Machine (SVM) and introduce two new resampling-based Boosting algorithms: SVM-Boosting based on Markov resampling (SVM-BM) and improved SVM-Boosting based on Markov resampling (ISVM-BM). In contrast with SVM-BM, ISVM-BM uses the support vectors to calculate the weights of the base classifiers. Numerical studies on benchmark datasets show that the two proposed resampling-based SVM Boosting algorithms with linear base classifiers achieve smaller misclassification rates and less total sampling-and-training time than three classical AdaBoost algorithms: Gentle AdaBoost, Real AdaBoost, and Modest AdaBoost. In addition, we compare the proposed SVM-BM algorithm with the widely used and efficient gradient Boosting algorithm XGBoost (eXtreme Gradient Boosting) and with SVM-AdaBoost, and present some useful discussions on the technical parameters.
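The abstract only names the SVM-BM scheme; the full algorithm is in the paper itself. As a rough illustration of the general idea, the sketch below combines linear base classifiers (trained here with a Pegasos-style hinge-loss solver, a stand-in for the paper's SVM solver) and draws each round's training sample with a Metropolis-style acceptance rule driven by the current model's loss, one plausible reading of u.e.M.c. resampling. The AdaBoost-style base-classifier weights, the toy data, and all constants are assumptions for illustration, not the paper's specification.

```python
import math
import random

random.seed(0)

# Toy 2-D data: label +1 if x0 + x1 > 0, else -1 (illustrative, not from the paper).
data = []
for _ in range(200):
    x = (random.uniform(-1, 1), random.uniform(-1, 1))
    data.append((x, 1 if x[0] + x[1] > 0 else -1))

def hinge_loss(w, b, x, y):
    return max(0.0, 1.0 - y * (w[0] * x[0] + w[1] * x[1] + b))

def train_linear_svm(sample, lam=0.01, epochs=20):
    """Pegasos-style subgradient descent on the regularized hinge loss."""
    w, b, t = [0.0, 0.0], 0.0, 0
    for _ in range(epochs):
        for x, y in sample:
            t += 1
            eta = 1.0 / (lam * t)
            if y * (w[0] * x[0] + w[1] * x[1] + b) < 1.0:
                w = [(1 - eta * lam) * w[0] + eta * y * x[0],
                     (1 - eta * lam) * w[1] + eta * y * x[1]]
                b += eta * y
            else:
                w = [(1 - eta * lam) * w[0], (1 - eta * lam) * w[1]]
    return w, b

def markov_resample(data, model, m):
    """Draw m examples as a Markov chain: a candidate z* replaces the current
    state z with probability min(1, exp(loss(z) - loss(z*))) under the current
    model, so hard examples for the current classifier are kept more often."""
    w, b = model
    z = random.choice(data)
    sample = []
    while len(sample) < m:
        cand = random.choice(data)
        delta = hinge_loss(w, b, *z) - hinge_loss(w, b, *cand)
        if delta >= 0 or random.random() < math.exp(delta):
            z = cand
        sample.append(z)
    return sample

def svm_boost(data, T=5, m=100):
    """Boosting loop: train a base SVM, weight it by its error
    (AdaBoost-style weight -- an assumption, not the paper's SVM-BM rule),
    then Markov-resample the next round's training set."""
    models, alphas = [], []
    sample = random.sample(data, m)
    for _ in range(T):
        model = train_linear_svm(sample)
        w, b = model
        err = sum(1 for x, y in data
                  if y * (w[0] * x[0] + w[1] * x[1] + b) <= 0) / len(data)
        err = min(max(err, 1e-6), 1 - 1e-6)
        alphas.append(0.5 * math.log((1 - err) / err))
        models.append(model)
        sample = markov_resample(data, model, m)
    return models, alphas

def predict(models, alphas, x):
    s = sum(a * (1 if w[0] * x[0] + w[1] * x[1] + b > 0 else -1)
            for (w, b), a in zip(models, alphas))
    return 1 if s > 0 else -1

models, alphas = svm_boost(data)
acc = sum(1 for x, y in data if predict(models, alphas, x) == y) / len(data)
print(f"ensemble training accuracy: {acc:.2f}")
```

ISVM-BM would additionally restrict the weight computation to the support vectors of each base classifier; that refinement is omitted here.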

Original language: English
Pages (from-to): 276-290
Number of pages: 15
Journal: Neural Networks
Volume: 131
State: Published - Nov 2020
Externally published: Yes

Keywords

  • Boosting
  • Consistency
  • Resampling
  • Uniformly ergodic Markov chain (u.e.M.c.)

