5 Sequential NLP and memory

In this chapter, you will learn about the use of memory for analyzing sequential NLP tasks. Specifically, this chapter covers:

  • How RNNs, LSTMs, and the newly introduced end-to-end memory networks (memory networks for short) handle memory
  • How these approaches differ in performance and memory capacity
  • How to apply these techniques to one common task: Question Answering

The central task in this chapter is Question Answering: answering a question on the basis of a number of facts. This task involves using memory: facts are stored in memory, and the question refers back to past information. How do the various models for sequential processing stack up against this task?
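
As a concrete illustration, here is a toy example of the kind of data involved, written as plain Python. The format is a sketch modeled on Facebook's bAbI question-answering tasks; the data we actually use is introduced in section 5.2.

facts = [
    "Mary moved to the bathroom.",
    "John went to the hallway.",
    "Mary travelled to the kitchen.",
]
question = "Where is Mary?"
answer = "kitchen"  # requires recalling the third fact, not the first

Answering correctly means storing all three facts and retrieving the one that is relevant to the question, even when distracting facts intervene.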

We will demonstrate the difference between "flat memory" approaches such as RNNs and LSTMs and "responsive memory" approaches such as end-to-end memory networks, and we will assess the benefits of memory networks for Question Answering. In the next chapter (Episodic Memory for NLP), we will apply end-to-end memory networks to a number of other sequential NLP tasks.
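
To make this contrast concrete up front, here is a minimal sketch in plain NumPy of a single memory-network "hop", next to the flat alternative. The embed function is a hypothetical stand-in for a learned sentence encoder (it is not part of any library used in this book); the point is only the shape of the two approaches.

import numpy as np

facts = ["Mary moved to the bathroom.",
         "John went to the hallway.",
         "Mary travelled to the kitchen."]
question = "Where is Mary?"

# Flat memory (the RNN/LSTM view): facts and question form one long
# sequence, and everything must survive in a single hidden state.
flat_input = " ".join(facts + [question])

def embed(sentence, dim=8):
    # Hypothetical stand-in for a learned sentence embedding.
    rng = np.random.default_rng(abs(hash(sentence)) % 2**32)
    return rng.standard_normal(dim)

# Responsive memory (the memory network view): one slot per fact,
# queried by the question through soft attention.
memory = np.stack([embed(f) for f in facts])       # one slot per fact
query = embed(question)
scores = memory @ query                            # match question to each fact
attention = np.exp(scores) / np.exp(scores).sum()  # softmax over memory slots
response = attention @ memory                      # attention-weighted readout

The attention step is what makes the memory "responsive": the question actively selects which stored facts to read out, instead of relying on the relevant information surviving a long sequential pass.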

The organization of the chapter is as follows:

Figure 5.1. Chapter organization.

5.1  Memory and language

5.1.1  The problem: Question Answering

5.2  Data and data processing

5.3  Question Answering with sequential models

5.3.1  RNNs for Question Answering

5.3.2  LSTMs for Question Answering

5.3.3  End-to-end memory networks for Question Answering

5.4  Conclusion

5.5  Further reading

5.6  Data and software resources