Journal of the Japanese Society for Artificial Intelligence
Online ISSN : 2435-8614
Print ISSN : 2188-2266
Print ISSN : 0912-8085 (until 2013)
An Asymptotic Analysis and Improvement of AdaBoost in the Binary Classification Case
Takashi ONODA, Gunnar RÄTSCH, Klaus R. MÜLLER

2000 Volume 15 Issue 2 Pages 287-296

Abstract

Recent work has shown that combining multiple versions of weak classifiers, such as decision trees or neural networks, reduces test set error. However, the theory of why such combinations reduce the generalization error is not yet well understood. To study this in greater detail, we analyze the asymptotic behavior of AdaBoost-type algorithms. The theoretical analysis establishes the relation between the distribution of margins of the training examples and the generated voting classification rule. This paper shows asymptotic experimental results for the binary classification case that underline the theoretical findings. Finally, we point out that AdaBoost-type algorithms can lead to overfitting, and we improve AdaBoost by introducing regularization to avoid overfitting and thereby reduce the generalization error. We also show in numerical experiments that our improvement can lead to superior classification results.
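As background for the margin-based analysis the abstract refers to: the margin of a training example (x_i, y_i), with y_i in {-1, +1}, under the voting classifier built by boosting is margin_i = y_i * (sum_t alpha_t h_t(x_i)) / (sum_t alpha_t), which lies in [-1, 1] and is positive exactly when the example is classified correctly (the alpha_t are nonnegative when each weak learner beats chance). The following is a minimal sketch of plain AdaBoost with decision stumps in Python; the function names and the stump learner are illustrative choices, and this is the standard algorithm, not the regularized variant the paper proposes.

import numpy as np

def adaboost(X, y, n_rounds=50):
    """Plain AdaBoost with decision stumps (illustrative sketch).
    X: (n, d) float array; y: (n,) array with labels in {-1, +1}."""
    n, d = X.shape
    w = np.full(n, 1.0 / n)                 # distribution over training examples
    stumps, alphas = [], []
    for _ in range(n_rounds):
        # Exhaustively pick the stump (feature, threshold, sign)
        # with the lowest weighted training error.
        best = None
        for j in range(d):
            for thr in np.unique(X[:, j]):
                for s in (1, -1):
                    pred = s * np.where(X[:, j] > thr, 1, -1)
                    err = w[pred != y].sum()
                    if best is None or err < best[0]:
                        best = (err, j, thr, s)
        err, j, thr, s = best
        if err >= 0.5:                       # weak learner no better than chance
            break
        alpha = 0.5 * np.log((1.0 - err) / max(err, 1e-12))
        pred = s * np.where(X[:, j] > thr, 1, -1)
        w *= np.exp(-alpha * y * pred)       # up-weight misclassified examples
        w /= w.sum()                         # renormalize the distribution
        # NOTE: the paper's improvement regularizes this weighting step to
        # avoid overfitting; see the paper for the exact update rule.
        stumps.append((j, thr, s))
        alphas.append(alpha)
    return stumps, alphas

def predict(X, stumps, alphas):
    """Weighted vote of the stumps; sign of the aggregated output."""
    agg = np.zeros(len(X))
    for (j, thr, s), a in zip(stumps, alphas):
        agg += a * s * np.where(X[:, j] > thr, 1, -1)
    return np.sign(agg)

A small usage example on a separable toy problem:

X = np.array([[0., 0.], [1., 0.], [0., 1.], [1., 1.]])
y = np.array([-1, -1, 1, 1])
stumps, alphas = adaboost(X, y, n_rounds=10)
print(predict(X, stumps, alphas))   # -> [-1. -1.  1.  1.]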

© 2000 The Japanese Society for Artificial Intelligence