5 Sequential NLP


This chapter covers

  • Using memory to analyze sequential NLP tasks
  • Understanding how RNNs, LSTM networks, and end-to-end memory networks handle memory
  • Applying these techniques to a shared task: Question Answering

The central task in this chapter is Question Answering: answering a question based on a number of facts. This task involves memory: facts are stored in memory, and the question refers back to that past information. How do the various models for sequential processing stack up against this task?
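To make the task concrete, here is a minimal illustrative instance of fact-based Question Answering, in the style of datasets commonly used for this task (the facts, question, and answer below are hypothetical examples, not drawn from the chapter's data):

    facts = [
        "Mary moved to the bathroom.",
        "John went to the hallway.",
        "Mary travelled to the office.",
    ]
    question = "Where is Mary?"
    answer = "office"   # requires recalling the latest fact about Mary

Answering correctly means holding all three facts in memory and retrieving the one the question actually refers to.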

We will demonstrate the difference between flat memory approaches, like recurrent neural networks (RNNs) and long short-term memory (LSTM) networks, and responsive memory approaches, like end-to-end memory networks, in the context of Question Answering, and we will assess the benefits of memory networks for this task. In chapter 6, we will apply end-to-end memory networks to a number of other sequential NLP tasks.
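The contrast can be sketched numerically: an RNN or LSTM compresses all facts into one fixed-size hidden state (flat memory), whereas an end-to-end memory network stores one embedding per fact and lets the question attend over them (responsive memory). The following sketch shows a single memory "hop" of the responsive approach, with made-up toy embeddings; the function name and vectors are illustrative, not the book's implementation:

    import numpy as np

    def attend_over_memory(question_vec, memory_vecs):
        # Match the question against every stored fact embedding,
        # then return the attention-weighted sum of the memories.
        scores = memory_vecs @ question_vec              # one score per fact
        weights = np.exp(scores) / np.exp(scores).sum()  # softmax over facts
        return weights @ memory_vecs                     # responsive read-out

    # Toy 4-dimensional embeddings for three stored facts and a question.
    memory = np.array([[0.1, 0.9, 0.0, 0.0],   # "Mary moved to the bathroom."
                       [0.8, 0.1, 0.1, 0.0],   # "John went to the hallway."
                       [0.1, 0.8, 0.0, 0.1]])  # "Mary travelled to the office."
    question = np.array([0.0, 1.0, 0.0, 0.0])  # "Where is Mary?"
    print(attend_over_memory(question, memory))  # most weight on the Mary facts

Because every fact keeps its own memory slot, the question can selectively read out the relevant ones; a flat hidden state offers no such addressable access.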

5.1 Memory and language

5.1.1 The problem: Question Answering

5.2 Data and data processing

5.3 Question Answering with sequential models

5.3.1 RNNs for Question Answering

5.3.2 LSTMs for Question Answering

5.3.3 End-to-end memory networks for Question Answering

Summary
