6 Regression algorithms

 

This chapter covers

  • Introducing regression
  • Bayesian linear regression
  • Hierarchical Bayesian regression
  • KNN regression
  • Gaussian process regression

In the previous chapter, we looked at supervised algorithms for classification. In this chapter, we focus on supervised learning for regression, in which we try to predict a continuous quantity. We will study four intriguing regression algorithms, which we will derive from first principles: Bayesian linear regression, hierarchical Bayesian regression, K-nearest neighbors (KNN) regression, and Gaussian process regression. These algorithms were selected because they span several important applications and illustrate diverse mathematical concepts. Regression algorithms are useful in a variety of applications: for example, they can be used to predict the price of financial assets or the level of CO2 in the atmosphere. Let’s begin by reviewing the fundamentals of regression.

6.1 Introduction to regression

In supervised learning, we are given a dataset D = {(x1, y1), …, (xn, yn)} consisting of pairs of inputs x and labels y. The goal of a regression algorithm is to learn a mapping from inputs x to outputs y, where y is a continuous quantity (i.e., y ∈ R).
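To make this setup concrete, the following is a minimal sketch (not taken from this chapter) of learning a mapping from x to a continuous y on a synthetic dataset; it uses ordinary least squares as a placeholder for the more sophisticated regressors derived later in the chapter, and all names and parameter values are illustrative assumptions.

```python
import numpy as np

# Toy dataset D = {(x_i, y_i)}: inputs x and continuous labels y
# (illustrative data; the true relationship is y = 2x + 1 plus noise)
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=50)
y = 2.0 * x + 1.0 + rng.normal(0, 0.5, size=50)

# Learn a mapping from inputs x to outputs y via ordinary least squares.
# A column of ones is stacked onto x so the model can fit an intercept.
X = np.column_stack([np.ones_like(x), x])
w, *_ = np.linalg.lstsq(X, y, rcond=None)

# w[0] estimates the intercept (near 1.0), w[1] the slope (near 2.0);
# predictions are continuous real values, not class labels
y_pred = X @ w
```

Because y is real-valued, the model is evaluated by how close its continuous predictions are to the targets (e.g., squared error), rather than by classification accuracy.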

6.2 Bayesian linear regression

6.3 Hierarchical Bayesian regression

6.4 KNN regression

6.5 Gaussian process regression

6.6 Exercises

Summary