Chapters 4, 5, and 6 give a complete overview of classical machine learning algorithms and help you master the most advanced gradient boosting techniques, such as XGBoost and LightGBM. You will learn how each algorithm works and how to apply it to suitable tabular data. Chapter 7 consolidates what you have learned with a practical example that demonstrates the complete analytical process for tabular data.
Specifically, chapter 4 introduces Scikit-learn and various classical machine learning methods, such as linear regression, logistic regression, and generalized linear models. You will grasp how a data pipeline works from a practical point of view and learn to validate results and compare performance across different models. Chapter 5 then explores decision trees and their ensembles, including bagging, random forests, and gradient boosting decision trees, with a detailed explanation of how the gradient boosting algorithm operates and why it excels with tabular data. Finally, chapter 5 wraps up with an overview of the different implementations, from Scikit-learn to XGBoost and LightGBM.
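To give a taste of the pipeline-and-compare workflow that chapters 4 and 5 develop, here is a minimal sketch in Scikit-learn. The synthetic dataset and the two chosen models are illustrative assumptions, not examples taken from the book:

```python
# A minimal sketch of the workflow covered in chapters 4 and 5:
# build pipelines, validate with cross-validation, compare models.
# The synthetic dataset here is an illustration, not one from the book.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

# Synthetic tabular data: 500 rows, 10 numeric features.
X, y = make_classification(n_samples=500, n_features=10, random_state=0)

models = {
    "logistic_regression": make_pipeline(StandardScaler(), LogisticRegression()),
    "gradient_boosting": GradientBoostingClassifier(random_state=0),
}

# Score each model with 5-fold cross-validation and compare mean accuracy.
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: {scores.mean():.3f} (+/- {scores.std():.3f})")
```

Wrapping the scaler and estimator in a single pipeline ensures the scaling is refit on each training fold, which is the kind of leakage-avoiding practice the chapters emphasize.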