Book Name: Boosting: Foundations and Algorithms
Category: Algorithms
Boosting: Foundations and Algorithms
Book Description
Boosting is a machine learning approach based on the idea of creating a highly accurate predictor by combining many weak and inaccurate "rules of thumb". A remarkably rich theory has developed around boosting, with connections to a wide range of topics including statistics, game theory, convex optimization, and information geometry. Boosting algorithms have also enjoyed practical success in such fields as biology, vision, and speech processing. At various times in its history, boosting has been the subject of mystery, controversy, even paradox.
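To illustrate the idea described above — combining many weak rules of thumb into one accurate predictor — here is a minimal, self-contained sketch of AdaBoost (the algorithm the book centers on) using threshold "stumps" as weak learners on a toy 1-D dataset. This is an illustrative sketch only, not code from the book; the function names and the toy data are invented for this example.

```python
# Minimal AdaBoost sketch (illustrative only; not from the book).
# Weak learner: a decision stump that predicts `pol` when x >= t, else -pol.
import math

def train_adaboost(X, y, rounds=10):
    """X: list of numbers, y: list of +1/-1 labels."""
    n = len(X)
    D = [1.0 / n] * n              # start with uniform example weights
    ensemble = []                  # list of (alpha, threshold, polarity)
    thresholds = sorted(set(X))
    for _ in range(rounds):
        best = None
        # Exhaustively pick the stump with least weighted error under D.
        for t in thresholds:
            for pol in (+1, -1):
                err = sum(D[i] for i in range(n)
                          if (pol if X[i] >= t else -pol) != y[i])
                if best is None or err < best[0]:
                    best = (err, t, pol)
        err, t, pol = best
        err = min(max(err, 1e-10), 1 - 1e-10)      # guard the log below
        alpha = 0.5 * math.log((1 - err) / err)    # stump's vote weight
        ensemble.append((alpha, t, pol))
        # Reweight: misclassified examples gain weight, then renormalize.
        preds = [(pol if x >= t else -pol) for x in X]
        D = [d * math.exp(-alpha * yi * pi) for d, yi, pi in zip(D, y, preds)]
        Z = sum(D)
        D = [d / Z for d in D]
    return ensemble

def predict(ensemble, x):
    # Final classifier: sign of the weighted vote of all stumps.
    score = sum(a * (pol if x >= t else -pol) for a, t, pol in ensemble)
    return 1 if score >= 0 else -1

# Toy data: positives in the middle, so no single stump can fit it,
# but a weighted combination of stumps can.
X = [0, 1, 2, 3, 4, 5]
y = [-1, -1, 1, 1, -1, -1]
ens = train_adaboost(X, y, rounds=20)
print([predict(ens, x) for x in X])
```

Note the two signature moves of boosting visible here: each round reweights the training examples so the next weak rule focuses on previous mistakes, and each rule's vote `alpha` grows as its weighted error shrinks.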
This book, written by the inventors of the method, brings together, organizes, simplifies, and substantially extends two decades of research on boosting, presenting both theory and applications in a way that is accessible to readers from diverse backgrounds while also providing an authoritative reference for advanced researchers. With its introduction to the relevant background material and the inclusion of exercises in every chapter, the book is appropriate for classroom use as well.
The book begins with a general introduction to machine learning algorithms and their analysis; then explores the core theory of boosting, especially its ability to generalize; examines some of the myriad other theoretical viewpoints that help to explain and understand boosting; provides extensions of boosting for more complex learning problems; and finally presents a number of advanced theoretical topics. Numerous practical applications and illustrations are offered throughout.
About Author
Robert E. Schapire is a professor of computer science at Princeton University. Yoav Freund is a professor of computer science at the University of California, San Diego. For their work on boosting, Freund and Schapire received both the Gödel Prize in 2003 and the Kanellakis Theory and Practice Award in 2004.
Table of contents:
Foundations of machine learning
Using AdaBoost to minimize training error
Direct bounds on the generalization error
The margins explanation for boosting’s effectiveness
Game theory, online learning, and boosting
Loss minimization and generalizations of boosting
Boosting, convex optimization, and information geometry
Using confidence-rated weak predictions
Multiclass classification problems
Learning to rank
Attaining the best possible accuracy
Optimally efficient boosting
Boosting in continuous time
Boosting: Foundations and Algorithms
Author(s): Robert E. Schapire, Yoav Freund
Publisher: The MIT Press, Year: 2014
ISBN: 9780262526036