Chapter 6. Progressing with GANs

 

This chapter covers

  • Progressively growing Discriminator and Generator networks throughout training
  • Making training more stable and the output more varied, higher quality, and higher resolution
  • Using TensorFlow Hub (TFHub), a new central repository for models and TensorFlow code

In this chapter, we provide a hands-on tutorial for building a Progressive GAN using TensorFlow and the newly released TensorFlow Hub (TFHub). The Progressive GAN (aka PGGAN, or ProGAN) is a cutting-edge technique that has managed to generate photorealistic images at resolutions as high as 1,024 x 1,024. Presented at the International Conference on Learning Representations (ICLR) in 2018, one of the top machine learning conferences, it made such a splash that Google promptly integrated it as one of the few models included in TensorFlow Hub. Yoshua Bengio, one of the godfathers of deep learning, lauded the technique as “almost too good to be true,” and upon its release it became an instant favorite of academic presentations and experimental projects.
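
To give a sense of where the hands-on part of this chapter is headed, the following sketch samples a few images from a pretrained Progressive GAN generator published on TFHub. It assumes the progan-128 module and the TF1-style hub.Module API that TFHub provided at the time of writing; treat it as a minimal illustration rather than the chapter's full tutorial.

import numpy as np
import tensorflow.compat.v1 as tf
import tensorflow_hub as hub

tf.disable_eager_execution()                       # hub.Module uses the TF1 graph/session style

# The progan-128 module maps a 512-dimensional latent vector
# to a 128 x 128 RGB image with pixel values in [0, 1].
module = hub.Module("https://tfhub.dev/google/progan-128/1")

latent_dim = 512
latents = tf.placeholder(tf.float32, shape=[None, latent_dim])
images = module(latents)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    sess.run(tf.tables_initializer())
    samples = sess.run(images,
                       feed_dict={latents: np.random.normal(size=[4, latent_dim])})

print(samples.shape)                               # (4, 128, 128, 3)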

6.1. Latent space interpolation
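
As a rough sketch of the idea, the snippet below linearly interpolates between two points in the latent space; feeding each intermediate vector through a trained generator produces a smooth visual transition between the two corresponding images. The 512-dimensional latent size is an assumption chosen to match the ProGAN module used later in the chapter.

import numpy as np

def interpolate_latents(z_start, z_stop, num_steps=10):
    # Convex combinations of the endpoints: step 0 is z_start, the last step is z_stop.
    weights = np.linspace(0.0, 1.0, num_steps)[:, None]
    return (1.0 - weights) * z_start[None, :] + weights * z_stop[None, :]

latent_dim = 512                                   # assumed latent size (matches progan-128)
z_a = np.random.normal(size=latent_dim)
z_b = np.random.normal(size=latent_dim)
z_path = interpolate_latents(z_a, z_b, num_steps=10)   # shape (10, 512)
# Passing z_path through the generator yields a gradual morph from image A to image B.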

 
 

6.2. They grow up so fast

 
 
 

6.2.1. Progressive growing and smoothing of higher-resolution layers
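
As a minimal sketch of the smoothing idea, the snippet below blends the upsampled output of the existing low-resolution stage with the output of the newly added higher-resolution block, weighted by a coefficient alpha that ramps from 0 to 1 while the new block fades in. The 16 x 16 to 32 x 32 transition and the layer sizes are illustrative assumptions, not the exact architecture used later in the chapter.

import tensorflow as tf
from tensorflow.keras import layers

def fade_in(alpha, upscaled_old, new):
    # alpha ramps linearly from 0 to 1 while the new block is introduced,
    # so the network grows smoothly instead of taking a sudden shock.
    return (1.0 - alpha) * upscaled_old + alpha * new

# Hypothetical example: growing a generator from 16 x 16 to 32 x 32.
x_16 = tf.random.normal([1, 16, 16, 128])            # features from the existing 16 x 16 stage
upscaled = layers.UpSampling2D()(x_16)                # nearest-neighbor upsample to 32 x 32
new_block = layers.Conv2D(128, 3, padding="same", activation="relu")(upscaled)

old_rgb = layers.Conv2D(3, 1)(upscaled)               # "toRGB" of the old, upsampled path
new_rgb = layers.Conv2D(3, 1)(new_block)              # "toRGB" of the new 32 x 32 block

alpha = 0.3                                           # grows toward 1.0 as training progresses
image = fade_in(alpha, old_rgb, new_rgb)              # shape (1, 32, 32, 3)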

 
 

6.2.2. Example implementation

 

6.2.3. Mini-batch standard deviation
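
A minimal sketch of the mini-batch standard deviation layer, assuming channels-last discriminator activations: it appends the average standard deviation across the batch as one extra, constant feature map, so the discriminator can sense how varied each batch is.

import tensorflow as tf

def minibatch_std_layer(x, epsilon=1e-8):
    # x: discriminator activations of shape (batch, height, width, channels).
    mean = tf.reduce_mean(x, axis=0, keepdims=True)
    # Standard deviation of every feature over the batch dimension.
    std = tf.sqrt(tf.reduce_mean(tf.square(x - mean), axis=0, keepdims=True) + epsilon)
    # Average over all features and spatial positions -> one scalar for the batch.
    mean_std = tf.reduce_mean(std)
    # Replicate the scalar as one extra feature map and append it.
    extra_map = tf.ones_like(x[:, :, :, :1]) * mean_std
    return tf.concat([x, extra_map], axis=-1)

# Hypothetical usage near the end of the discriminator:
activations = tf.random.normal([8, 4, 4, 512])
out = minibatch_std_layer(activations)                # shape (8, 4, 4, 513)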

 
 
 

6.2.4. Equalized learning rate
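
A minimal sketch of the idea as a custom Keras layer: weights are initialized from a plain standard normal and rescaled at every forward pass by the per-layer He constant sqrt(2 / fan_in), so layers of very different sizes end up learning at comparable speeds. The layer name and the dense (rather than convolutional) variant are illustrative choices.

import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

class EqualizedDense(layers.Layer):
    # Dense layer with runtime weight scaling (equalized learning rate).
    def __init__(self, units, **kwargs):
        super().__init__(**kwargs)
        self.units = units

    def build(self, input_shape):
        fan_in = int(input_shape[-1])
        self.scale = np.sqrt(2.0 / fan_in)            # He constant, applied at runtime
        self.w = self.add_weight(shape=(fan_in, self.units),
                                 initializer=tf.random_normal_initializer(stddev=1.0),
                                 trainable=True)
        self.b = self.add_weight(shape=(self.units,),
                                 initializer="zeros",
                                 trainable=True)

    def call(self, inputs):
        return tf.matmul(inputs, self.w * self.scale) + self.b

# Hypothetical usage:
x = tf.random.normal([4, 512])
y = EqualizedDense(256)(x)                            # shape (4, 256)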

 
 

6.2.5. Pixel-wise feature normalization in the generator
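
A minimal sketch of pixel-wise feature normalization, assuming channels-last generator activations: every pixel's feature vector is divided by its root-mean-square magnitude, which keeps activation magnitudes in the generator from escalating as the two networks compete.

import tensorflow as tf

def pixel_norm(x, epsilon=1e-8):
    # x: generator activations of shape (batch, height, width, channels).
    # Scale each pixel's feature vector to roughly unit length.
    return x / tf.sqrt(tf.reduce_mean(tf.square(x), axis=-1, keepdims=True) + epsilon)

# Hypothetical usage after a generator convolution:
features = tf.random.normal([1, 8, 8, 256])
normalized = pixel_norm(features)                     # same shape, unit-ish norm per pixel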

 
 

6.3. Summary of key innovations

 
 

6.4. TensorFlow Hub and hands-on

 

6.5. Practical applications

 
 
 

Summary

 
 
 
 