Robert Schapire


ACM Paris Kanellakis Theory and Practice Award

USA - 2004

Citation

Theory and practice of boosting

Yoav Freund, Robert Schapire

For the development of the theory and practice of boosting and its applications to machine learning.

This award recognizes the seminal work and distinguished contributions of Yoav Freund and Robert Schapire to the development of the theory and practice of boosting, a general and provably effective method of producing arbitrarily accurate prediction rules by combining weak learning rules. That this was possible at all, let alone in polynomial time, was not known until Schapire's breakthrough 1990 paper. A year later, Freund developed a more efficient boosting algorithm, and subsequent work culminated in Freund and Schapire's AdaBoost (Adaptive Boosting) algorithm. AdaBoost represents a significant contribution of computer science to statistics. It can be used to significantly reduce the error of algorithms used in statistical analysis, spam filtering, fraud detection, optical character recognition, and market segmentation, among other applications. Its elegance, wide applicability, simplicity of implementation, and great success in practice have transformed boosting into one of the pillars of machine learning, with broad impact across scientific communities and industry. The algorithm is already widely used and is still growing in relevance and importance to the practice of machine learning.
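
The core idea the citation describes is simple to state: maintain a weight on each training example, repeatedly fit a weak rule to the weighted data, and reweight so that misclassified examples count more in the next round; the final prediction is a weighted vote of the weak rules. The sketch below is a minimal from-scratch rendering of that idea, assuming binary labels in {-1, +1} and decision stumps as the weak learners; the function and class names (fit_stump, adaboost_train, adaboost_predict, Stump) are illustrative, not taken from any particular library.

```python
# Minimal AdaBoost sketch with decision stumps as weak learners.
# Labels are assumed to be in {-1, +1}; names here are illustrative only.
import numpy as np
from dataclasses import dataclass

@dataclass
class Stump:
    feature: int        # index of the feature the stump splits on
    threshold: float    # split threshold
    polarity: int       # +1 or -1: which side of the threshold predicts +1
    alpha: float = 0.0  # weight of this stump in the final vote

def stump_predict(stump, X):
    return np.where(stump.polarity * (X[:, stump.feature] - stump.threshold) >= 0, 1, -1)

def fit_stump(X, y, w):
    """Pick the single-feature threshold rule with the lowest weighted error."""
    n, d = X.shape
    best, best_err = None, np.inf
    for j in range(d):
        for thr in np.unique(X[:, j]):
            for polarity in (1, -1):
                pred = np.where(polarity * (X[:, j] - thr) >= 0, 1, -1)
                err = np.sum(w[pred != y])
                if err < best_err:
                    best, best_err = Stump(j, thr, polarity), err
    return best, best_err

def adaboost_train(X, y, rounds=25):
    n = len(y)
    w = np.full(n, 1.0 / n)          # start with uniform example weights
    ensemble = []
    for _ in range(rounds):
        stump, err = fit_stump(X, y, w)
        err = max(err, 1e-12)        # guard against division by zero
        stump.alpha = 0.5 * np.log((1.0 - err) / err)
        pred = stump_predict(stump, X)
        w *= np.exp(-stump.alpha * y * pred)   # upweight examples this stump got wrong
        w /= w.sum()
        ensemble.append(stump)
    return ensemble

def adaboost_predict(ensemble, X):
    votes = sum(s.alpha * stump_predict(s, X) for s in ensemble)
    return np.where(votes >= 0, 1, -1)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))
    y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)   # simple toy problem
    model = adaboost_train(X, y, rounds=20)
    print("training accuracy:", np.mean(adaboost_predict(model, X) == y))
```

Each weak rule here is only slightly better than random guessing on its own, yet the weighted combination drives training error down rapidly, which is exactly the "arbitrarily accurate prediction rules from weak learning rules" property the citation highlights.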