10 Combining models to maximize results: Ensemble learning
This chapter covers
- What ensemble learning is.
- Joining several weak classifiers to form a strong classifier.
- Bagging: A method to randomly join several classifiers.
- Boosting: A method to join several classifiers in a smarter way.
- AdaBoost: A very successful example of boosting methods.
After learning many interesting and useful machine learning classifiers, a good question to ask is, “Is there a way to combine them?” Thankfully, the answer is yes! In this chapter we learn several ways to build stronger classifiers by combining weaker ones. The two methods we cover are bagging and boosting. In a nutshell, bagging consists of constructing a few classifiers in a random way and putting them together. Boosting, on the other hand, consists of building these classifiers in a smarter way, picking each one strategically to focus on the previous ones’ mistakes. One of the most popular examples of boosting is the AdaBoost algorithm (Adaptive Boosting), which we study at the end of the chapter.
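To make the idea of combining weak classifiers concrete before diving into bagging and boosting, here is a minimal sketch (not the book's code; the thresholds and toy data are made up for illustration). Each weak learner is a simple threshold rule that is wrong on some points, and the strong classifier takes a majority vote among them.

```python
def make_stump(threshold):
    """Weak learner: predicts 1 if x > threshold, else 0."""
    return lambda x: 1 if x > threshold else 0

def majority_vote(classifiers, x):
    """Strong classifier: each weak learner votes; the majority wins."""
    votes = sum(clf(x) for clf in classifiers)
    return 1 if votes > len(classifiers) / 2 else 0

# Three imperfect stumps with hypothetical thresholds
weak_learners = [make_stump(2.0), make_stump(3.0), make_stump(4.5)]

# Toy 1-D points, truly labeled 1 when x > 3.0
data = [(1.0, 0), (2.5, 0), (3.5, 1), (5.0, 1)]
predictions = [majority_vote(weak_learners, x) for x, _ in data]
print(predictions)  # the vote corrects the individual stumps' mistakes
```

Note that the first stump misclassifies x = 2.5 and the third misclassifies x = 3.5, yet the combined vote labels every point correctly. Bagging and boosting differ mainly in *how* the individual learners are built, not in this final combination step.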