8 Reduce, reuse, recycle your words (RNNs and LSTMs)
This chapter covers
- Unrolling recurrence so you can understand how to use it for NLP
- Implementing word and character-based RNNs in PyTorch
- Identifying applications where RNNs are your best option
- Re-engineering your datasets for training RNNs
- Customizing and tuning your RNN structure for your NLP problems
- Understanding backpropagation through time
- Combining long- and short-term memory mechanisms to make your RNN smarter
An RNN (recurrent neural network) recycles tokens. Why would you want to recycle and reuse your words? To build a more sustainable NLP pipeline, of course! ;) Recurrence is just another word for recycling. An RNN uses recurrence to remember the tokens it has already read and to reuse that understanding when predicting the target variable. And if you use an RNN to predict the next word, it can generate text, going on and on and on, until you tell it to stop. This sustainability, or regenerative ability, is the RNN's superpower.
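To make that recycling concrete, here is a minimal sketch of recurrence in plain Python. The scalar weights `w_in` and `w_rec` and the toy token values are hypothetical, chosen only for illustration; a real RNN (like the PyTorch models later in this chapter) uses weight matrices and vector hidden states, but the loop is the same: each step's output is fed back in as memory for the next step.

```python
import math

# Hypothetical scalar weights, purely for illustration.
w_in = 0.5    # how strongly the current token influences the memory
w_rec = 0.9   # how strongly the previous memory is "recycled"

def rnn_step(x, h):
    # New hidden state blends the current token with the previous memory.
    return math.tanh(w_in * x + w_rec * h)

tokens = [1.0, 0.0, -1.0]   # a toy "sentence" of numeric token values
h = 0.0                     # initial hidden state: no memory yet
for x in tokens:
    h = rnn_step(x, h)      # the output is recycled as the next step's memory

print(h)
```

Notice that the middle token is 0.0, yet the hidden state after that step is not zero: the `w_rec * h` term carried the memory of the first token forward. That carried-forward state is what lets an RNN keep generating, step after step, until you tell it to stop.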