This chapter covers:
- How strongly supervised end-to-end memory networks can be applied to sequential NLP problems beyond standard question answering.
- How to implement a multi-hop memory network that supports semi-supervised training.
- How strongly supervised memory networks compare to semi-supervised memory networks.
For the data we use, we will find that strongly supervised memory networks produce above-baseline results with very little effort. Semi-supervised memory networks yield higher accuracy in a number of cases, but not consistently.
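Before diving in, it may help to see the basic shape of a multi-hop end-to-end memory network. The sketch below is a minimal, illustrative forward pass in the MemN2N style, not the implementation developed in this chapter: all sizes, the bag-of-words sentence encoding, and the weight-sharing choices are assumptions made for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

V, d = 20, 8            # vocabulary size, embedding dimension (assumed)
n_mem, n_hops = 5, 3    # memory slots and attention hops (assumed)

# Embedding matrices: A embeds memories for addressing, C embeds them
# for output, B embeds the question; W maps the final state to answer
# logits. Sharing A/C across hops is one common simplification.
A = rng.normal(scale=0.1, size=(V, d))
C = rng.normal(scale=0.1, size=(V, d))
B = rng.normal(scale=0.1, size=(V, d))
W = rng.normal(scale=0.1, size=(d, V))

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def embed(word_ids, E):
    # Bag-of-words sentence representation: sum of word embeddings.
    return E[word_ids].sum(axis=0)

# Toy story and question as lists of random word ids.
story = [rng.integers(0, V, size=4) for _ in range(n_mem)]
question = rng.integers(0, V, size=3)

u = embed(question, B)                           # initial controller state
for _ in range(n_hops):
    m = np.stack([embed(s, A) for s in story])   # memory keys
    c = np.stack([embed(s, C) for s in story])   # memory values
    p = softmax(m @ u)                           # attention over memories
    o = p @ c                                    # weighted memory readout
    u = u + o                                    # next-hop controller state

answer_probs = softmax(u @ W)                    # distribution over vocabulary
print(answer_probs.shape)
```

Each hop re-attends over the same memories with an updated query, which is what lets the network chain evidence across several supporting sentences; strong supervision tells the model which memories to attend to, whereas the end-to-end (semi-supervised) variant learns the attention from the answer signal alone.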
The following figure shows the organization of this chapter: