Part 2 Deeper learning: Neural networks
Part 1 gathered the tools for natural language processing and dove into machine learning with statistics-driven vector space models. You discovered that even more meaning could be found when you looked at the statistics of connections between words. You also learned about algorithms, such as latent semantic analysis (LSA), which can help make sense of those connections by gathering words into topics. But part 1 considered only linear relationships between words, and you often had to use human judgment to design feature extractors and select model parameters.
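The topic-gathering idea behind LSA can be sketched as a truncated SVD of a term-document count matrix. This is a minimal illustration, not the book's own code; the toy vocabulary and counts below are invented for the example:

```python
import numpy as np

# Hypothetical toy term-document count matrix: rows are words,
# columns are documents. Two "pets" documents, two "finance" documents.
vocab = ["cat", "dog", "pet", "stock", "market", "trade"]
counts = np.array([
    [2, 1, 0, 0],  # cat
    [1, 2, 0, 0],  # dog
    [2, 2, 0, 1],  # pet
    [0, 0, 3, 2],  # stock
    [0, 0, 2, 3],  # market
    [0, 1, 2, 2],  # trade
], dtype=float)

# LSA: factor the matrix with SVD, then keep only the top-k singular
# directions as "topics".
U, s, Vt = np.linalg.svd(counts, full_matrices=False)
k = 2
word_topics = U[:, :k] * s[:k]  # each word as a point in 2-D topic space

for word, vec in zip(vocab, word_topics):
    print(f"{word:>7}: {vec.round(2)}")
```

Words that co-occur in the same documents (like "cat" and "dog") end up near each other in the reduced topic space, while unrelated words (like "cat" and "stock") point in different directions.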
In part 2, you will peel open the “black box” of deep learning and learn to model text in deep, nonlinear ways. Chapter 5 gives you a primer on neural networks. Then, in chapter 6, you will learn about word vectors and how they astounded even natural language processing experts. Across chapters 6 through 8, you will gradually build up layers of complexity in your neural network language models, and you will start to see how neural networks can recognize patterns in the order of words rather than just their presence or absence.
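The jump from linear to nonlinear models can be previewed with a classic example: no single linear layer can compute XOR, but a tiny network with one ReLU hidden layer can. The weights below are hand-picked for illustration (not learned, and not from the book), just to show what a hidden nonlinearity buys you:

```python
import numpy as np

def relu(z):
    # The nonlinearity: without it, stacked layers collapse into one
    # linear transformation.
    return np.maximum(0.0, z)

# Hand-picked weights for a 2-input, 2-hidden-unit network computing XOR.
W1 = np.array([[1.0, 1.0],   # both hidden units sum the two inputs...
               [1.0, 1.0]])
b1 = np.array([0.0, -1.0])   # ...but unit 2 only fires once both are on
W2 = np.array([1.0, -2.0])   # output: h1 - 2*h2

def forward(x):
    h = relu(x @ W1 + b1)
    return h @ W2

for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x, "->", forward(np.array(x, dtype=float)))  # 0.0, 1.0, 1.0, 0.0
```

The hidden unit with the -1 bias detects the "both inputs on" case and subtracts it back out, something no weighted sum of the raw inputs can express on its own.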