6 A continuous approach to splitting points: Logistic classifiers

 

In this chapter

  • the difference between hard assignments and soft assignments in classification models
  • the sigmoid function, a continuous activation function
  • discrete perceptrons vs. continuous perceptrons, also called logistic classifiers
  • the logistic regression algorithm for classifying data
  • coding the logistic regression algorithm in Python
  • using the logistic classifier in Turi Create to analyze the sentiment of movie reviews
  • using the softmax function to build classifiers for more than two classes

In the previous chapter, we built a classifier that determined whether a sentence was happy or sad. But as we can imagine, some sentences are happier than others. For example, the sentence “I’m good” and the sentence “Today was the most wonderful day in my life!” are both happy, yet the second is much happier than the first. Wouldn’t it be nice to have a classifier that not only predicts whether sentences are happy or sad but also gives a rating for how happy they are—say, a classifier that tells us that the first sentence is 60% happy and the second one is 95% happy? In this chapter, we define the logistic classifier, which does precisely that. This classifier assigns each sentence a score between 0 and 1, in such a way that the happier a sentence is, the higher the score it receives.
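As a preview of the idea (the details come in the sections that follow), a logistic classifier computes a score from the sentence and passes it through the sigmoid function, which squashes any real number into the interval (0, 1). Here is a minimal sketch with hypothetical hand-picked word weights—a real classifier would learn these weights from data, as we'll see later in the chapter:

```python
import math

def sigmoid(z):
    """Squash any real number into the interval (0, 1)."""
    return 1 / (1 + math.exp(-z))

# Hypothetical word weights (positive = happy, negative = sad);
# these are made up for illustration, not learned from data.
weights = {"wonderful": 2.0, "good": 0.5, "sad": -1.5}

def happiness_score(sentence):
    # Sum the weights of the known words appearing in the sentence,
    # then squash the total with the sigmoid.
    z = sum(w for word, w in weights.items() if word in sentence.lower())
    return sigmoid(z)

print(happiness_score("I'm good"))                                     # roughly 0.62
print(happiness_score("Today was the most wonderful day in my life!")) # roughly 0.88
```

With these toy weights, both sentences score above 0.5 (happy), and the second scores higher than the first—exactly the kind of graded prediction a hard yes/no perceptron cannot give.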

Logistic classifiers: A continuous version of perceptron classifiers

 
 
 

How to find a good logistic classifier? The logistic regression algorithm

 
 
 

Coding the logistic regression algorithm

 
 
 
 

Real-life application: Classifying IMDB reviews with Turi Create

 
 

Classifying into multiple classes: The softmax function

 

Summary

 
 
 

Exercises

 
 