Chapter 6. Deep learning for text and sequences


This chapter covers

  • Preprocessing text data into useful representations
  • Working with recurrent neural networks
  • Using 1D convnets for sequence processing

This chapter explores deep-learning models that can process text (understood as sequences of words or sequences of characters), timeseries, and sequence data in general. The two fundamental deep-learning algorithms for sequence processing are recurrent neural networks and 1D convnets, the one-dimensional version of the 2D convnets that we covered in the previous chapters. We’ll discuss both of these approaches in this chapter.
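To preview the core idea behind recurrent networks before we study them properly in section 6.2: a recurrent layer processes a sequence one timestep at a time, carrying a state vector across steps. The sketch below is an illustration only, written in plain Python (the function name `simple_rnn`, the shapes, and the `tanh` activation are choices made for this example, not code from the book):

```python
import math

def simple_rnn(inputs, W, U, b):
    """Process a sequence of input vectors, reusing one state vector.

    inputs: list of input vectors (each a list of floats), one per timestep
    W, U:   weight matrices (lists of rows) applied to the input and the state
    b:      bias vector; its length is the size of the state
    """
    state = [0.0] * len(b)                      # initial state: all zeros
    for x in inputs:                            # loop over timesteps
        new_state = []
        for i in range(len(b)):
            # state_t[i] = tanh(W[i] . input_t + U[i] . state_{t-1} + b[i])
            z = b[i]
            z += sum(W[i][j] * x[j] for j in range(len(x)))
            z += sum(U[i][j] * state[j] for j in range(len(state)))
            new_state.append(math.tanh(z))
        state = new_state                       # the new state feeds the next step
    return state                                # final state summarizes the sequence
```

The essential point is the loop: unlike the densely connected and convolutional networks of earlier chapters, the output at each timestep depends on everything the network has seen so far.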

Applications of these algorithms include the following:

  • Document classification and timeseries classification, such as identifying the topic of an article or the author of a book
  • Timeseries comparisons, such as estimating how closely related two documents or two stock tickers are
  • Sequence-to-sequence learning, such as translating an English sentence into French
  • Sentiment analysis, such as classifying the sentiment of tweets or movie reviews as positive or negative
  • Timeseries forecasting, such as predicting the future weather at a certain location, given recent weather data

This chapter’s examples focus on two narrow tasks: sentiment analysis on the IMDB dataset, a task we approached earlier in the book, and temperature forecasting. But the techniques demonstrated for these two tasks are relevant to all the applications just listed, and many more.

6.1. Working with text data

6.2. Understanding recurrent neural networks

6.3. Advanced use of recurrent neural networks

6.4. Sequence processing with convnets
