Differential evolutionary Bayesian classifier

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

2 Scopus citations

Abstract

Naïve Bayes (NB), which rests on the attribute independence assumption, has been widely applied in many domains for its simplicity and efficiency. However, the independence assumption is often violated in real-world applications. In response, a large body of research has sought to improve NB's accuracy by relaxing the attribute independence assumption, for example Lazy Bayesian Rules (LBR), Tree-Augmented Naive Bayes (TAN) and the Averaged One-Dependence Estimator (AODE). AODE, which averages all Super-Parent One-Dependence Estimators (SPODEs), has attracted wide attention for its outstanding performance. Because each SPODE plays a different role, performance can be expected to improve significantly if different weights are assigned to the SPODEs. We propose a framework of linearly weighted SPODE ensembles and an efficient weight-learning strategy based on differential evolution. Experiments show that the proposed algorithm outperforms NB, AODE, WAODE, TAN and LBR in most cases.
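The weighted-SPODE idea in the abstract can be sketched as follows: each SPODE produces class-probability estimates, the ensemble combines them as a weighted average, and a differential-evolution loop searches for weights that maximize accuracy. This is a minimal illustrative sketch, not the paper's implementation; the per-SPODE probabilities and labels below are hypothetical toy data.

```python
import random

# Hypothetical class-1 probability estimates from three SPODEs on six
# samples, plus the true labels (toy stand-ins, assumption).
SPODE_PROBS = [
    [0.9, 0.2, 0.8, 0.4, 0.7, 0.1],  # SPODE 1 (accurate)
    [0.6, 0.3, 0.4, 0.6, 0.9, 0.3],  # SPODE 2 (weaker)
    [0.1, 0.9, 0.2, 0.8, 0.3, 0.7],  # SPODE 3 (anti-correlated)
]
LABELS = [1, 0, 1, 0, 1, 0]

def ensemble_probs(weights):
    """Linearly weighted SPODE ensemble: weighted average of member estimates."""
    total = sum(weights)
    return [
        sum(w * probs[i] for w, probs in zip(weights, SPODE_PROBS)) / total
        for i in range(len(LABELS))
    ]

def accuracy(weights):
    """Fitness of a weight vector: ensemble classification accuracy."""
    preds = [1 if p >= 0.5 else 0 for p in ensemble_probs(weights)]
    return sum(p == y for p, y in zip(preds, LABELS)) / len(LABELS)

def differential_evolution(fitness, dim, pop_size=20, F=0.5, CR=0.9,
                           generations=50, seed=0):
    """Classic DE/rand/1/bin search over non-negative weight vectors."""
    rng = random.Random(seed)
    pop = [[rng.uniform(0.0, 1.0) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        for i in range(pop_size):
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            # Mutation: donor = a + F * (b - c), clipped to stay non-negative.
            donor = [max(0.0, pop[a][k] + F * (pop[b][k] - pop[c][k]))
                     for k in range(dim)]
            # Binomial crossover with one guaranteed donor component.
            j_rand = rng.randrange(dim)
            trial = [donor[k] if (rng.random() < CR or k == j_rand) else pop[i][k]
                     for k in range(dim)]
            if sum(trial) == 0.0:
                continue  # degenerate all-zero weight vector; keep the parent
            # Greedy selection: the trial replaces the parent if no worse.
            if fitness(trial) >= fitness(pop[i]):
                pop[i] = trial
    return max(pop, key=fitness)

best = differential_evolution(accuracy, dim=len(SPODE_PROBS))
```

On this toy data uniform weights misclassify two samples, while a vector that down-weights the anti-correlated SPODE classifies all six correctly, so DE has room to improve on plain averaging.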

Original language: English
Title of host publication: 2008 IEEE International Conference on Granular Computing, GRC 2008
Pages: 191-195
Number of pages: 5
DOIs
State: Published - 2008
Event: 2008 IEEE International Conference on Granular Computing, GRC 2008 - Hangzhou, China
Duration: 26 Aug 2008 – 28 Aug 2008

Publication series

Name: 2008 IEEE International Conference on Granular Computing, GRC 2008

Conference

Conference: 2008 IEEE International Conference on Granular Computing, GRC 2008
Country/Territory: China
City: Hangzhou
Period: 26/08/08 – 28/08/08

Keywords

  • AODE
  • Classifier
  • Differential evolutionary
  • Genetic algorithm
  • Naïve Bayes
