Chapter 7. Improving classification with the AdaBoost meta-algorithm

 

This chapter covers

  • Combining similar classifiers to improve performance
  • Applying the AdaBoost algorithm
  • Dealing with classification imbalance

If you were going to make an important decision, you’d probably get the advice of multiple experts instead of trusting one person. Why should the problems you solve with machine learning be any different? This is the idea behind a meta-algorithm. Meta-algorithms are a way of combining other algorithms. We’ll focus on one of the most popular meta-algorithms, called AdaBoost. This is a powerful tool to have in your toolbox, because AdaBoost is considered by some to be the best supervised learning algorithm.

In this chapter we’re first going to discuss different ensemble methods of classification. Next we’ll focus on boosting in general and on AdaBoost, one particular boosting algorithm. We’ll then build a decision stump classifier, which is a decision tree with only a single node, and apply the AdaBoost algorithm to it. Finally, we’ll put our classifier to work on a difficult dataset and see how it quickly outperforms other classification methods.
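
To make the idea of a decision stump concrete, here is a minimal sketch of that kind of classifier, written with NumPy. This is an illustrative assumption of what such code can look like, not the implementation developed later in the chapter; the names stump_classify and build_stump, the 'lt'/'gt' inequality flags, and the threshold grid are all hypothetical choices. A stump picks one feature and one threshold, and the weighted error it minimizes is driven by the sample weights that the boosting loop will supply.

```python
import numpy as np

def stump_classify(data, dim, thresh, inequality):
    """Label every sample +1 or -1 by comparing a single feature to a threshold."""
    labels = np.ones(data.shape[0])
    if inequality == 'lt':
        labels[data[:, dim] <= thresh] = -1.0
    else:
        labels[data[:, dim] > thresh] = -1.0
    return labels

def build_stump(data, class_labels, weights, num_steps=10):
    """Search every feature and a grid of thresholds for the split with the
    lowest weighted error. `weights` is the per-sample weight vector that a
    boosting loop would update between rounds."""
    m, n = data.shape
    best_stump, best_estimate, min_error = {}, None, np.inf
    for dim in range(n):
        lo, hi = data[:, dim].min(), data[:, dim].max()
        step = (hi - lo) / num_steps
        for i in range(-1, num_steps + 1):
            for ineq in ('lt', 'gt'):
                thresh = lo + i * step
                predicted = stump_classify(data, dim, thresh, ineq)
                # weighted error: sum of weights on misclassified samples
                err = np.dot(weights, predicted != class_labels)
                if err < min_error:
                    min_error = err
                    best_estimate = predicted.copy()
                    best_stump = {'dim': dim, 'thresh': thresh, 'ineq': ineq}
    return best_stump, min_error, best_estimate
```

On its own a stump like this is a weak learner: it usually does only slightly better than random guessing. The point of AdaBoost, covered in the sections that follow, is to train many such stumps on reweighted versions of the data and combine their votes into a strong classifier.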

7.1. Classifiers using multiple samples of the dataset

 

7.2. Train: improving the classifier by focusing on errors

 
 
 

7.3. Creating a weak learner with a decision stump

 
 

7.4. Implementing the full AdaBoost algorithm

 
 
 

7.5. Test: classifying with AdaBoost

 
 

7.6. Example: AdaBoost on a difficult dataset

 
 
 
 

7.7. Classification imbalance

 
 
 

7.8. Summary

 
 
 