Chapter 9. Deep learning for sequences and text

 

This chapter covers

  • How sequential data differs from nonsequential data
  • Which deep-learning techniques are suitable for problems involving sequential data
  • How to represent text in deep learning with one-hot encoding, multi-hot encoding, and word embeddings (see the encoding sketch after this list)
  • What RNNs are and why they are well suited to sequential problems
  • What 1D convolution is and why it is an attractive alternative to RNNs
  • What makes sequence-to-sequence tasks unique, and how the attention mechanism helps solve them
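
As a concrete preview, here is a minimal TensorFlow.js sketch contrasting one-hot and multi-hot encoding. The four-word vocabulary and the example sentence are invented for illustration; this is a sketch of the idea, not the book's exact code.

const tf = require('@tensorflow/tfjs');

// Toy vocabulary, made up for illustration:
// index 0 = 'the', 1 = 'cat', 2 = 'sat', 3 = 'mat'.
const vocabSize = 4;

// One-hot encoding: each token becomes a vector with a single 1.
// Result shape: [numTokens, vocabSize].
const tokens = tf.tensor1d([0, 1, 2], 'int32'); // 'the cat sat'
const oneHot = tf.oneHot(tokens, vocabSize);
oneHot.print(); // [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0]]

// Multi-hot encoding: a single vector per document marking which
// words occur at all; word order is discarded. Shape: [vocabSize].
const multiHot = oneHot.max(0);
multiHot.print(); // [1, 1, 1, 0]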
 
 
 

9.1. Second attempt at weather prediction: Introducing RNNs
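
To make the idea concrete up front, below is a minimal TensorFlow.js sketch of an RNN regressor over a multivariate weather window: a GRU layer reads the window step by step, carrying state across steps, and a single linear unit predicts the target temperature. The window length (240 steps), feature count (14), layer size, and optimizer are illustrative assumptions, not the book's exact configuration.

const tf = require('@tensorflow/tfjs');

// Illustrative input shape: 240 time steps, 14 measurements per step.
const lookBack = 240;   // assumed window length
const numFeatures = 14; // assumed feature count

const model = tf.sequential();
// The GRU consumes the window sequentially, maintaining internal state.
model.add(tf.layers.gru({units: 32, inputShape: [lookBack, numFeatures]}));
// One linear unit outputs the predicted temperature.
model.add(tf.layers.dense({units: 1}));

model.compile({optimizer: 'rmsprop', loss: 'meanAbsoluteError'});
model.summary();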

 
 
 
 

9.2. Building deep-learning models for text
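
The building blocks combine naturally: an embedding layer maps word indices to dense, learned vectors, and a 1D convolution slides over the resulting sequence to detect local word patterns. Below is a minimal sketch of such a text classifier; the vocabulary size, sequence length, and layer sizes are assumptions for illustration.

const tf = require('@tensorflow/tfjs');

const vocabSize = 10000; // assumed vocabulary size
const maxLen = 100;      // assumed padded sequence length

const model = tf.sequential();
// Map each word index to a learned 32-dimensional vector.
model.add(tf.layers.embedding(
    {inputDim: vocabSize, outputDim: 32, inputLength: maxLen}));
// 1D convolution extracts local patterns from the embedded sequence.
model.add(tf.layers.conv1d({filters: 32, kernelSize: 5, activation: 'relu'}));
// Keep each filter's strongest response, regardless of position.
model.add(tf.layers.globalMaxPooling1d());
// Binary output, e.g., positive vs. negative sentiment.
model.add(tf.layers.dense({units: 1, activation: 'sigmoid'}));

model.compile(
    {optimizer: 'adam', loss: 'binaryCrossentropy', metrics: ['accuracy']});
model.summary();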

 

9.3. Sequence-to-sequence tasks with attention mechanism
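
At its core, attention computes a weighted average: score the decoder's query against every encoder output, normalize the scores with a softmax, and mix the encoder outputs by those weights. Here is a minimal scaled dot-product attention sketch with made-up tensor sizes; it shows the mechanism, not the book's exact code.

const tf = require('@tensorflow/tfjs');

// Illustrative sizes: 1 example, 6 encoder steps, 8-dimensional states.
const encoderOutputs = tf.randomNormal([1, 6, 8]); // keys and values
const query = tf.randomNormal([1, 1, 8]);          // one decoder step

// Similarity of the query to every encoder step: shape [1, 1, 6].
const scores = tf.matMul(query, encoderOutputs, false, true)
    .div(tf.sqrt(tf.scalar(8)));
// Softmax over the encoder steps turns scores into attention weights.
const weights = tf.softmax(scores, -1);
// Context vector: weighted average of encoder outputs, shape [1, 1, 8].
const context = tf.matMul(weights, encoderOutputs);
context.print();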

 
 
 
 

Materials for further reading

 
 
 
 

Exercises

 
 

Summary

 