6 Common design building blocks
This chapter covers
- Adding new activation functions
- Inserting new layers to improve training
- Skipping layers as a useful design pattern
- Combining new activations, layers, and skip connections into approaches more powerful than the sum of their parts
At this point, we have learned about the three most common and fundamental types of neural networks: fully connected, convolutional, and recurrent. We have improved all of these architectures by changing the optimizer and learning rate schedule, which alter how we update the parameters (weights) of our models, giving us more accurate models almost for free. Everything we have learned so far also has a long shelf life: it addresses problems that have existed for decades and persist today. Together, these topics give you a good foundation for speaking the language of deep learning, along with the fundamental building blocks from which larger algorithms are built.
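As a quick refresher before we build on it, here is a minimal sketch of that recipe, assuming the PyTorch setup used so far; the toy model and random data are hypothetical placeholders. The optimizer applies the weight updates, and the learning rate schedule controls how large those updates are over time.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical toy model and random data, just to show the moving parts
model = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 3))
X, y = torch.randn(128, 20), torch.randint(0, 3, (128,))

optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=10)

for epoch in range(10):
    optimizer.zero_grad()
    loss = F.cross_entropy(model(X), y)
    loss.backward()
    optimizer.step()   # the optimizer updates the parameters (weights)
    scheduler.step()   # the schedule shrinks the step size over time
```

The rest of this chapter changes the model itself rather than the update rule: new activations, new layers, and skip connections all slot into the `model` definition above while the training loop stays the same.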