The central task in this chapter is Question Answering: answering a question based on a number of facts. This task inherently involves memory: the facts are stored in memory, and the question refers back to that past information. How do the various models for sequential processing measure up to this task?
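To make the task concrete, here is a toy example in the style of the bAbI benchmark often used to evaluate memory networks; the particular sentences and variable names are illustrative, not data from this chapter.

```python
# A toy fact-based Question Answering instance: a short story of facts,
# a question that refers back to them, and the expected answer.
story = [
    "Mary moved to the bathroom.",
    "John went to the hallway.",
]
question = "Where is Mary?"
answer = "bathroom"  # answering requires remembering the first fact
```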
We will demonstrate the difference between flat memory approaches, such as recurrent neural networks (RNNs) and long short-term memory (LSTM) networks, and responsive memory approaches, such as end-to-end memory networks, in the context of Question Answering, and we will assess the benefits of memory networks for this task. In chapter 6, we will apply end-to-end memory networks to a number of other sequential NLP tasks.
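The contrast can be sketched in a few lines. A flat memory approach compresses the whole story into one sequential pass, whereas a responsive memory approach stores each fact as a separate memory slot and lets the question attend to those slots. The following minimal numpy sketch illustrates the responsive mechanism only; the random vectors stand in for learned embeddings, and the dimension `d` is an arbitrary toy choice.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8  # toy embedding dimension

facts = [
    "Mary moved to the bathroom.",
    "John went to the hallway.",
]
question = "Where is Mary?"

# Stand-ins for learned sentence embeddings: one random vector per sentence.
memory = rng.standard_normal((len(facts), d))  # one memory slot per fact
q = rng.standard_normal(d)                     # embedded question

# Responsive memory: the question scores every stored fact...
scores = memory @ q
attention = np.exp(scores) / np.exp(scores).sum()  # softmax over facts

# ...and the attention-weighted memory is combined with the question
# to drive answer prediction (the prediction layer is omitted here).
output = attention @ memory + q
print("attention over facts:", attention)
```

In a trained end-to-end memory network the embeddings are learned so that the attention concentrates on the supporting fact; a flat RNN or LSTM, by contrast, must carry that fact forward implicitly in its hidden state.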