6 Wide Convolutional Neural Networks
This chapter covers
- The wide convolutional layer design pattern
- Understanding the advantages of wide vs deep layers
- Refactoring micro-architecture patterns to reduce computational complexity
- Coding formerly SOTA wide convolutional models with the procedural design pattern
Up to now, we’ve focused on networks that go deeper: stacked layers, block layers, and shortcuts in residual networks for image-related tasks such as classification, object localization, and image segmentation. Now we are going to take a look at networks with wide, rather than deep, convolutional layers. Starting in 2014 with Inception v1 (GoogLeNet), followed by Inception v2 in 2015 and later ResNeXt, neural network designs moved toward wide layers, reducing the need to go ever deeper. Essentially, a wide layer design means having multiple convolutions in parallel and then concatenating their outputs. In contrast, deeper designs run convolutions sequentially, each layer transforming the output of the one before it.
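To make the distinction concrete, here is a minimal sketch of a wide block in TF.Keras: several convolutions of different kernel sizes run in parallel over the same input, and their feature maps are concatenated along the channel axis. The branch count, filter sizes, and the name `wide_block` are illustrative assumptions, not a specific published architecture.

```python
# Illustrative sketch of a wide (parallel) convolutional block.
# Branch structure and filter sizes are assumptions for demonstration.
from tensorflow.keras import Input, Model
from tensorflow.keras.layers import Conv2D, Concatenate

def wide_block(x, filters=64):
    # Three convolutions applied in parallel to the same input tensor
    b1 = Conv2D(filters, (1, 1), padding='same', activation='relu')(x)
    b2 = Conv2D(filters, (3, 3), padding='same', activation='relu')(x)
    b3 = Conv2D(filters, (5, 5), padding='same', activation='relu')(x)
    # Concatenate the branch outputs along the channel axis
    return Concatenate()([b1, b2, b3])

inputs = Input((32, 32, 3))
outputs = wide_block(inputs)
model = Model(inputs, outputs)
```

Note that with three branches of 64 filters each, the block outputs 192 channels (3 × 64), while `padding='same'` preserves the 32 × 32 spatial dimensions. A deep design would instead chain these convolutions one after another.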