You’ve probably heard a lot about “random forests,” “XGBoost,” or “gradient boosting.” Someone always seems to be using one or another of these to build cool applications or win Kaggle competitions. Have you ever wondered what all the fuss is about?
The fuss, it turns out, is all about ensemble methods, a powerful machine-learning paradigm that has found its way into all kinds of applications in health care, finance, insurance, recommendation systems, search, and a lot of other areas.
This book will introduce you to the wide world of ensemble methods, and this part will get you going. To paraphrase the incomparable Julie Andrews from The Sound of Music,