7 Learning with Continuous and Count Labels
This chapter covers
- An introduction to regression in machine learning
- Understanding loss and likelihood functions for regression
- Understanding when to use different loss and likelihood functions
- Adapting parallel and sequential ensembles for regression problems
- Using ensembles for regression in practical settings
Many real-world modeling, prediction and forecasting problems are best framed and solved as regression problems. Regression has a rich history predating the advent of machine learning and has long been a part of the standard statistician’s toolkit.
Regression techniques have been developed and widely applied in many areas. Here are just a few examples:
- Weather forecasting: to predict the precipitation tomorrow using data from today, including temperature, humidity, cloud cover, wind and more.
- Insurance analytics: to predict the number of automobile insurance claims over a period of time, given various vehicle and driver attributes.
- Financial forecasting: to predict stock prices using historical stock data and trends.
- Demand forecasting: to predict the residential energy load for the next three months using historical, demographic and weather data.
Whereas chapters 2-6 introduced ensembling techniques for classification problems, in this chapter, we will see how to adapt those techniques to regression problems.
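As a quick preview, the sketch below (assuming scikit-learn and a synthetic dataset, neither of which is part of this chapter's own case studies) shows that a parallel ensemble such as a random forest can be trained as a regressor rather than a classifier and evaluated with a regression metric; later sections develop these ideas in much greater depth.

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

# Synthetic regression data: 500 examples, 10 features, continuous targets
X, y = make_regression(n_samples=500, n_features=10, noise=10.0, random_state=42)
X_trn, X_tst, y_trn, y_tst = train_test_split(X, y, test_size=0.25, random_state=42)

# A parallel ensemble (random forest) used for regression instead of classification
ensemble = RandomForestRegressor(n_estimators=100, random_state=42)
ensemble.fit(X_trn, y_trn)

# Evaluate with a regression metric: mean squared error on the held-out set
y_pred = ensemble.predict(X_tst)
print(f'Test MSE: {mean_squared_error(y_tst, y_pred):.2f}')
```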