Part 2. Optimizing deep learning


In this part of the book, we look at evolutionary and genetic algorithms that may be used to optimize and improve deep learning systems. We start in chapter 5 by solving a core problem in deep learning: hyperparameter optimization. This chapter demonstrates various methods, from random and grid search to genetic algorithms, particle swarm optimization, evolutionary strategies, and differential evolution.
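To give a flavor of the simplest of these methods before chapter 5, the following is a minimal sketch of random hyperparameter search. The train_and_score function is a hypothetical stand-in for training and validating a model, not code from the book.

```python
# Minimal sketch of random hyperparameter search.
# train_and_score is a hypothetical stand-in for a real training run.
import random

def train_and_score(learning_rate, batch_size):
    # Fake validation score that peaks near lr=0.01 and batch_size=64.
    return -abs(learning_rate - 0.01) * 100 - abs(batch_size - 64) / 64

best_score, best_params = float("-inf"), None
for _ in range(50):
    params = {
        "learning_rate": 10 ** random.uniform(-4, -1),  # log-uniform sample
        "batch_size": random.choice([16, 32, 64, 128]),
    }
    score = train_and_score(**params)
    if score > best_score:
        best_score, best_params = score, params

print(best_params, best_score)
```

The later chapters replace this blind sampling with population-based methods that share information between candidate solutions.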

In chapter 6, we move into neuroevolution, applying evolutionary methods to deep learning architectures and parameters. We demonstrate how network parameters, or weights, can be optimized without backpropagation or deep learning optimizers.
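As a minimal sketch of that idea, the following evolves the weights of a tiny network on the XOR task using only mutation and selection, with no gradients. The layer sizes, population size, and mutation settings are illustrative assumptions, not the book's chapter 6 code.

```python
# Minimal sketch: evolving network weights without backpropagation.
import numpy as np

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def forward(weights, X):
    w1, b1, w2, b2 = weights
    h = np.tanh(X @ w1 + b1)                   # hidden layer
    return 1 / (1 + np.exp(-(h @ w2 + b2)))    # sigmoid output

def fitness(weights):
    return -np.mean((forward(weights, X) - y) ** 2)  # negative MSE

def random_weights(rng):
    return [rng.normal(0, 1, (2, 4)), rng.normal(0, 1, 4),
            rng.normal(0, 1, (4, 1)), rng.normal(0, 1, 1)]

rng = np.random.default_rng(0)
population = [random_weights(rng) for _ in range(50)]
for generation in range(200):
    population.sort(key=fitness, reverse=True)
    parents = population[:10]                  # keep the fittest individuals
    children = []
    while len(children) < 40:
        parent = parents[rng.integers(len(parents))]
        # Mutate a copy of the parent by adding Gaussian noise to each weight.
        children.append([w + rng.normal(0, 0.1, w.shape) for w in parent])
    population = parents + children

print("best fitness:", fitness(population[0]))
```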

Then in chapter 7, we continue demonstrating neuroevolution, this time enhancing the architecture and parameters of convolutional neural networks. We also look at developing an EvoCNN network model using custom architecture encodings with genetic algorithms.
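To preview the encoding idea, the following is a minimal sketch of representing a CNN architecture as a variable-length list of layer "genes" that a genetic algorithm can mutate. The gene format and mutation rule here are illustrative assumptions, not the EvoCNN encoding developed in chapter 7.

```python
# Minimal sketch: a CNN architecture as a mutable list of layer genes.
import random

LAYER_CHOICES = {
    "conv": lambda: {"type": "conv", "filters": random.choice([16, 32, 64]),
                     "kernel": random.choice([3, 5])},
    "pool": lambda: {"type": "pool", "size": 2},
}

def random_architecture(max_layers=6):
    # Variable-length chromosome: feature-extraction genes plus a classifier gene.
    layers = [LAYER_CHOICES[random.choice(["conv", "pool"])]()
              for _ in range(random.randint(2, max_layers))]
    return layers + [{"type": "dense", "units": random.choice([64, 128])}]

def mutate(architecture, rate=0.2):
    # Point mutation: occasionally re-sample a gene of the same layer type.
    return [LAYER_CHOICES[g["type"]]()
            if g["type"] in LAYER_CHOICES and random.random() < rate else g
            for g in architecture]

parent = random_architecture()
child = mutate(parent)
print(parent)
print(child)
```

In the full approach, each decoded architecture is trained briefly to obtain a fitness score, and selection plus crossover and mutation drive the population toward better-performing networks.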