Boosting: Foundations and Algorithms
- 9h 36m
- Robert E. Schapire, Yoav Freund
- The MIT Press
- 2012
Boosting is an approach to machine learning based on the idea of creating a highly accurate predictor by combining many weak and inaccurate “rules of thumb.” A remarkably rich theory has evolved around boosting, with connections to a range of topics, including statistics, game theory, convex optimization, and information geometry. Boosting algorithms have also enjoyed practical success in such fields as biology, vision, and speech processing. At various times in its history, boosting has been perceived as mysterious, controversial, even paradoxical.
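To make the weak-to-strong idea concrete, here is a minimal sketch of AdaBoost, the core algorithm the book studies, using one-feature threshold "stumps" as the weak rules of thumb. The stump learner, function names, and toy data below are illustrative assumptions, not code from the book.

```python
import numpy as np

def train_adaboost(X, y, n_rounds=10):
    """Minimal AdaBoost with one-feature threshold stumps.

    X: (n, d) feature array; y: labels in {-1, +1}.
    Returns a list of (feature, threshold, polarity, alpha) stumps.
    """
    n, d = X.shape
    w = np.full(n, 1.0 / n)                 # start with uniform weights
    ensemble = []
    for _ in range(n_rounds):
        # Exhaustively pick the stump with the lowest weighted error.
        best, best_err = None, np.inf
        for j in range(d):
            for thresh in np.unique(X[:, j]):
                for polarity in (1, -1):
                    pred = polarity * np.where(X[:, j] < thresh, 1, -1)
                    err = w[pred != y].sum()
                    if err < best_err:
                        best_err, best = err, (j, thresh, polarity)
        eps = max(best_err, 1e-10)          # guard against zero error
        alpha = 0.5 * np.log((1 - eps) / eps)   # this stump's vote weight
        j, thresh, polarity = best
        pred = polarity * np.where(X[:, j] < thresh, 1, -1)
        w *= np.exp(-alpha * y * pred)      # upweight the mistakes
        w /= w.sum()                        # renormalize to a distribution
        ensemble.append((j, thresh, polarity, alpha))
    return ensemble

def predict(ensemble, X):
    """Classify by a weighted majority vote of the stumps."""
    scores = np.zeros(len(X))
    for j, thresh, polarity, alpha in ensemble:
        scores += alpha * polarity * np.where(X[:, j] < thresh, 1, -1)
    return np.sign(scores)

# Toy usage: no single threshold separates this 1-D data,
# but a few boosted stumps classify it perfectly.
X = np.array([[1.0], [2.0], [3.0], [4.0], [5.0], [6.0]])
y = np.array([1, 1, -1, -1, 1, 1])
model = train_adaboost(X, y, n_rounds=5)
print(predict(model, X))  # [ 1.  1. -1. -1.  1.  1.]
```

The reweighting step is the essence of boosting: each round the misclassified examples gain weight, forcing the next weak rule to focus on what the ensemble so far gets wrong.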
This book, written by the inventors of the method, brings together, organizes, simplifies, and substantially extends two decades of research on boosting, presenting both theory and applications in a way that is accessible to readers from diverse backgrounds while also providing an authoritative reference for advanced researchers. With its introductory treatment of all material and its inclusion of exercises in every chapter, the book is appropriate for course use as well.
The book begins with a general introduction to machine learning algorithms and their analysis; then explores the core theory of boosting, especially its ability to generalize; examines some of the myriad other theoretical viewpoints that help to explain and understand boosting; provides practical extensions of boosting for more complex learning problems; and finally presents a number of advanced theoretical topics. Numerous applications and practical illustrations are offered throughout.
About the Authors
Robert E. Schapire is Professor of Computer Science at Princeton University. Yoav Freund is Professor of Computer Science at the University of California, San Diego. For their work on boosting, Freund and Schapire received both the Gödel Prize in 2003 and the Kanellakis Theory and Practice Award in 2004.
In this Book
- Introduction and Overview
- Foundations of Machine Learning
- Using AdaBoost to Minimize Training Error
- Direct Bounds on the Generalization Error
- The Margins Explanation for Boosting's Effectiveness
- Game Theory, Online Learning, and Boosting
- Loss Minimization and Generalizations of Boosting
- Boosting, Convex Optimization, and Information Geometry
- Using Confidence-Rated Weak Predictions
- Multiclass Classification Problems
- Learning to Rank
- Attaining the Best Possible Accuracy
- Optimally Efficient Boosting
- Boosting in Continuous Time