Chapter 7. Improving classification with the AdaBoost meta-algorithm
This chapter covers
- Combining classifiers with ensemble methods
- Boosting and the AdaBoost algorithm
- Building a decision stump classifier
- Applying AdaBoost to a difficult dataset
If you were going to make an important decision, you’d probably get the advice of multiple experts instead of trusting one person. Why should the problems you solve with machine learning be any different? This is the idea behind a meta-algorithm: meta-algorithms are a way of combining other algorithms. We’ll focus on one of the most popular meta-algorithms, called AdaBoost. It’s a powerful tool to have in your toolbox because AdaBoost is considered by some to be the best supervised learning algorithm.
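To make the “multiple experts” idea concrete, here is a minimal sketch of combining several weak classifiers by a simple majority vote. The function name and the toy “experts” are hypothetical illustrations, not the chapter’s code; AdaBoost itself will use a smarter, weighted combination.

from typing import Callable, List, Sequence

def majority_vote(classifiers: List[Callable[[Sequence[float]], int]],
                  x: Sequence[float]) -> int:
    """Combine the +1/-1 predictions of several classifiers by majority vote.

    `classifiers` is any list of functions mapping a sample to +1 or -1
    (a hypothetical stand-in for trained weak learners).
    """
    votes = sum(clf(x) for clf in classifiers)
    return 1 if votes >= 0 else -1

# Three toy "experts" that each look at a different part of the input
experts = [
    lambda x: 1 if x[0] > 0.5 else -1,
    lambda x: 1 if x[1] > 0.3 else -1,
    lambda x: 1 if x[0] + x[1] > 1.0 else -1,
]
print(majority_vote(experts, [0.6, 0.4]))   # prints 1: two of the three experts vote +1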
In this chapter, we’ll first discuss different ensemble methods of classification. Next we’ll focus on boosting and on AdaBoost, an algorithm that implements boosting. We’ll then build a decision stump classifier, which is a single-node decision tree, and apply the AdaBoost algorithm to it. Finally, we’ll put our boosted classifier to work on a difficult dataset and see how it quickly outperforms other classification methods.
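As a preview of what a decision stump does, here is a minimal sketch of stump-style classification: every sample is labeled +1 or -1 purely by comparing one feature against a threshold. The function and parameter names here are illustrative assumptions, not the implementation developed later in the chapter.

import numpy as np

def stump_predict(data_matrix, dim, threshold, inequality):
    """Classify every sample with a single-node decision tree (a decision stump).

    Each sample gets -1 or +1 by comparing one feature (`dim`) against
    `threshold`; `inequality` ('lt' or 'gt') picks which side is labeled -1.
    """
    predictions = np.ones((data_matrix.shape[0], 1))
    if inequality == 'lt':
        predictions[data_matrix[:, dim] <= threshold] = -1.0
    else:
        predictions[data_matrix[:, dim] > threshold] = -1.0
    return predictions

# Toy usage: split four 2-D points on feature 0 at threshold 1.5
data = np.array([[1.0, 2.1], [2.0, 1.1], [1.3, 1.0], [2.0, 1.0]])
print(stump_predict(data, dim=0, threshold=1.5, inequality='lt').ravel())
# -> [-1.  1. -1.  1.]

A single stump like this is a weak classifier; on its own it usually does only a little better than random guessing. AdaBoost’s job is to train many such stumps in sequence, each one focusing on the examples the previous ones got wrong, and then combine their weighted votes into a strong classifier.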