Chapter 5. Logistic regression


This chapter covers

  • The sigmoid function and the logistic regression classifier
  • Our first look at optimization
  • The gradient descent optimization algorithm
  • Dealing with missing values in our data

This is an exciting chapter because it’s the first one where we encounter optimization algorithms. If you think about it, many of the things we do in life are optimization problems. Some examples of optimization from daily life: How do we get from point A to point B in the least amount of time? How do we make the most money doing the least amount of work? How do we design an engine to produce the most horsepower while using the least amount of fuel? The things we can do with optimization are powerful. In this chapter I’ll introduce a few optimization algorithms and use them to train a nonlinear function for classification.
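To preview where the chapter is headed, here is a minimal sketch of the two ideas from the list above: the sigmoid function and a gradient-based update of the regression weights. The toy dataset, the step size `alpha`, and the iteration count are all illustrative choices, not values from the book.

```python
import numpy as np

def sigmoid(z):
    # Squashes any real-valued input into (0, 1); outputs above 0.5
    # are treated as class 1, outputs below 0.5 as class 0.
    return 1.0 / (1.0 + np.exp(-z))

# Tiny made-up dataset: a constant bias column plus two features.
X = np.array([[1.0, 2.0, 1.5],
              [1.0, 1.0, 0.5],
              [1.0, 3.0, 2.5],
              [1.0, 0.5, 1.0]])
y = np.array([1, 0, 1, 0])

# Batch gradient ascent on the log-likelihood: repeatedly nudge the
# weights in the direction that shrinks the prediction error.
weights = np.zeros(X.shape[1])
alpha = 0.1            # step size (illustrative)
for _ in range(500):   # number of iterations (illustrative)
    error = y - sigmoid(X @ weights)
    weights += alpha * (X.T @ error)

preds = (sigmoid(X @ weights) > 0.5).astype(int)
```

After training, `preds` reproduces the labels on this linearly separable toy data. Section 5.2 develops the gradient update properly, including why the error term `y - sigmoid(X @ weights)` is the gradient direction of the log-likelihood.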

5.1. Classification with logistic regression and the sigmoid function: a tractable step function

5.2. Using optimization to find the best regression coefficients

5.3. Example: estimating horse fatalities from colic

5.4. Summary