Chapter 3. Your first GAN: Generating handwritten digits

This chapter covers

  • Exploring the theory behind GANs and adversarial training
  • Understanding how GANs differ from conventional neural networks
  • Implementing a GAN in Keras and training it to generate handwritten digits

In this chapter, we explore the foundational theory behind GANs. We introduce the commonly used mathematical notation you may encounter if you choose to dive deeper into this field, perhaps by reading a more theoretically focused publication or even one of the many academic papers on this topic. This chapter also provides background knowledge for the more advanced chapters, particularly chapter 5.

From a strictly practical standpoint, however, you don’t have to worry about many of these formalisms—much as you don’t need to know how an internal combustion engine works to drive a car. Machine learning libraries such as Keras and TensorFlow abstract the underlying mathematics away from us and neatly package them into importable lines of code.
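As a small illustration of what gets "packaged into importable lines of code," here is binary cross-entropy — the cost function at the heart of the original GAN formulation — written out by hand in NumPy. This is a sketch for intuition only; in the tutorial later in this chapter we simply let Keras supply the equivalent loss (`keras.losses.binary_crossentropy`) rather than implementing it ourselves. The example labels and predictions below are made up for illustration.

```python
import numpy as np

def binary_cross_entropy(y_true, y_pred, eps=1e-7):
    """Hand-rolled binary cross-entropy, for illustration only.
    Keras packages the same computation as keras.losses.binary_crossentropy."""
    y_pred = np.clip(y_pred, eps, 1 - eps)  # guard against log(0)
    return -np.mean(y_true * np.log(y_pred) +
                    (1 - y_true) * np.log(1 - y_pred))

# A Discriminator's-eye view: real examples are labeled 1, fakes 0.
labels = np.array([1.0, 1.0, 0.0, 0.0])          # illustrative labels
predictions = np.array([0.9, 0.8, 0.2, 0.1])     # illustrative confidences
print(binary_cross_entropy(labels, predictions))
```

Confident, correct predictions drive this loss toward zero, while confident mistakes blow it up — exactly the pressure that adversarial training applies to the Generator and the Discriminator, as the sections below make precise.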

This will be a recurring theme throughout this book; it is also true for machine learning and deep learning in general. So, if you are someone who prefers to dive straight into practice, feel free to skim through the theory section and skip ahead to the coding tutorial.

3.1. Foundations of GANs: Adversarial training

3.1.1. Cost functions

3.1.2. Training process

3.2. The Generator and the Discriminator

3.2.1. Conflicting objectives

3.2.2. Confusion matrix

3.3. GAN training algorithm

3.4. Tutorial: Generating handwritten digits

3.4.1. Importing modules and specifying model input dimensions

3.4.2. Implementing the Generator

3.4.3. Implementing the Discriminator

3.4.4. Building the model

3.4.5. Training

3.4.6. Outputting sample images

3.4.7. Running the model

3.4.8. Inspecting the results

3.5. Conclusion

Summary