Abstract
AdaBoost algorithms fuse weak classifiers into a strong classifier by adaptively determining the fusion weights of the weak classifiers. In this paper, an enhanced AdaBoost algorithm that adjusts the inner structure of weak classifiers (ISABoost) is proposed. In traditional AdaBoost algorithms, the weak classifiers are not changed once they are trained. In ISABoost, the inner structures of the weak classifiers are adjusted before their fusion weights are determined. ISABoost thus inherits the advantages of AdaBoost algorithms in fusing weak classifiers into a strong classifier, while giving each weak classifier a second chance to be made stronger. The adjusted weak classifiers contribute more to correctly classifying the hardest samples. To show the effectiveness of the proposed ISABoost algorithm, its application to scene categorization is evaluated. Comparisons of ISABoost and AdaBoost algorithms on three widely used scene datasets demonstrate the effectiveness of ISABoost.
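To make the mechanism concrete, the sketch below shows a standard AdaBoost loop over decision stumps, with a hook where ISABoost would adjust a weak classifier's inner structure before its fusion weight (alpha) is computed. The `adjust_inner_structure` function here is a hypothetical placeholder (it simply retrains the stump on the current sample weights), not the paper's actual adjustment procedure; the fusion-weight formula `alpha = 0.5 * ln((1 - err) / err)` is the classic AdaBoost rule.

```python
import numpy as np

def train_stump(X, y, w):
    # Exhaustively pick the (feature, threshold, polarity) decision stump
    # that minimizes the weighted error under sample weights w.
    best, best_err = None, np.inf
    for f in range(X.shape[1]):
        for t in np.unique(X[:, f]):
            for pol in (1, -1):
                pred = np.where(pol * (X[:, f] - t) >= 0, 1, -1)
                err = np.sum(w[pred != y])
                if err < best_err:
                    best_err, best = err, (f, t, pol)
    return best, best_err

def stump_predict(stump, X):
    f, t, pol = stump
    return np.where(pol * (X[:, f] - t) >= 0, 1, -1)

def adjust_inner_structure(stump, X, y, w):
    # Hypothetical stand-in for ISABoost's inner-structure adjustment:
    # here we just refit the stump on the current weights. The paper's
    # actual procedure for its weak classifiers is not reproduced here.
    new_stump, _ = train_stump(X, y, w)
    return new_stump

def boost(X, y, rounds=5, adjust=False):
    n = len(y)
    w = np.full(n, 1.0 / n)              # uniform initial sample weights
    ensemble = []                        # list of (alpha, stump) pairs
    for _ in range(rounds):
        stump, err = train_stump(X, y, w)
        if adjust:                       # ISABoost: adjust BEFORE weighting
            stump = adjust_inner_structure(stump, X, y, w)
            err = np.sum(w[stump_predict(stump, X) != y])
        err = max(err, 1e-10)            # guard against division by zero
        alpha = 0.5 * np.log((1 - err) / err)   # adaptive fusion weight
        pred = stump_predict(stump, X)
        w *= np.exp(-alpha * y * pred)   # emphasize misclassified samples
        w /= w.sum()
        ensemble.append((alpha, stump))
    return ensemble

def ensemble_predict(ensemble, X):
    # Strong classifier: sign of the fusion-weighted vote.
    score = sum(a * stump_predict(s, X) for a, s in ensemble)
    return np.sign(score)
```

On a toy one-dimensional problem, `boost(X, y)` and `boost(X, y, adjust=True)` both return an ensemble whose `ensemble_predict` separates the classes; the only difference is whether each weak classifier passes through the adjustment hook before its fusion weight is fixed.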
| Original language | English |
|---|---|
| Pages (from-to) | 104-113 |
| Number of pages | 10 |
| Journal | Neurocomputing |
| Volume | 103 |
| DOIs | |
| State | Published - 1 Mar 2013 |
Keywords
- AdaBoost
- Back-propagation networks
- Pattern classification
- SVM
- Scene categorization
- Weight learning
Title: ISABoost: A weak classifier inner structure adjusting based AdaBoost algorithm-ISABoost based application in scene categorization