9 Random numbers in JAX

This chapter covers

  • Generating (pseudo) random numbers in JAX
  • Differences with NumPy and using keys to represent pseudo-random number generator state
  • Working with keys and generating random numbers in real-life applications

Working with randomness is essential, as many machine learning algorithms rely on stochasticity in some way. Randomness is used to make random splits and samplings of the data, to generate random data, and to perform random augmentations. It is also required for specific neural network–related algorithms like Dropout, for architectures like variational autoencoders (VAEs) and generative adversarial networks (GANs), and for hyperparameter tuning, where random search is used to find better hyperparameter values. Randomness is also essential to a process at the heart of deep learning: weight initialization. Randomness is no less important outside machine learning: random numbers are at the heart of Monte Carlo simulations and statistical sampling, and they have many applications in computer simulations of real-world processes.

9.1 Generating random data

 

9.1.1 Loading the dataset
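
To have something concrete to work with, here is a minimal sketch of loading an image dataset; MNIST via tensorflow_datasets is an illustrative assumption, not a fixed choice:

import jax.numpy as jnp
import tensorflow_datasets as tfds

# Load the whole training split as one batch of NumPy arrays
# (batch_size=-1 asks tfds for the full split at once).
data = tfds.as_numpy(tfds.load('mnist', split='train', batch_size=-1))

images = jnp.asarray(data['image'], dtype=jnp.float32) / 255.0  # scale to [0, 1]
labels = jnp.asarray(data['label'])

print(images.shape, labels.shape)  # (60000, 28, 28, 1) (60000,)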

 
 
 

9.1.2 Generating random noise
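
Drawing noise in JAX takes an explicit key. A minimal sketch, where the noise scale of 0.1 and the 28x28x1 image shape are illustrative (and `images` is assumed from the loading sketch above):

import jax
import jax.numpy as jnp

key = jax.random.PRNGKey(42)                     # explicit PRNG state

# Standard normal noise shaped like a 28x28 grayscale image.
noise = jax.random.normal(key, shape=(28, 28, 1))

# A noisy version of the first image, clipped back into [0, 1].
noisy = jnp.clip(images[0] + 0.1 * noise, 0.0, 1.0)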

 
 

9.1.3 Performing a random augmentation
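
A random horizontal flip is one simple augmentation. A sketch, assuming a flip probability of 0.5 and the hypothetical helper name random_flip:

import jax
import jax.numpy as jnp

def random_flip(key, image):
    # Draw a random boolean and flip the width axis when it is True.
    flip = jax.random.bernoulli(key, p=0.5)
    return jax.lax.cond(flip, jnp.fliplr, lambda img: img, image)

augmented = random_flip(jax.random.PRNGKey(0), images[0])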

 

9.2 Differences with NumPy

 
 
 
 

9.2.1 How NumPy works
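
NumPy's classic API hides the generator state inside the library, and every call silently mutates it. A minimal sketch:

import numpy as np

np.random.seed(42)           # set the hidden global state once

print(np.random.uniform())   # this call reads *and mutates* that state
print(np.random.uniform())   # so the second call yields a different number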

 
 
 

9.2.2 Seed and state in NumPy
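
Because that state is global, re-seeding rewinds the whole stream, and the state object can be captured and restored. For example:

import numpy as np

np.random.seed(123)
first = np.random.uniform(size=3)

np.random.seed(123)                  # same seed -> the stream replays
second = np.random.uniform(size=3)
assert np.allclose(first, second)

state = np.random.get_state()        # snapshot the hidden state...
np.random.uniform(size=10)           # ...advance the stream...
np.random.set_state(state)           # ...and rewind it again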

 

9.2.3 JAX PRNG
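
JAX instead makes the state an explicit, immutable key: the same key always produces the same numbers, and fresh randomness comes from splitting keys. A minimal sketch:

import jax

key = jax.random.PRNGKey(42)         # the PRNG state, held by the caller

print(jax.random.uniform(key))       # deterministic given the key
print(jax.random.uniform(key))       # identical: nothing was mutated

key, subkey = jax.random.split(key)  # derive fresh, independent keys
print(jax.random.uniform(subkey))    # a new number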

 
 
 

9.2.4 Advanced JAX PRNG configuration
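
JAX ships more than one PRNG implementation, with Threefry as the default. As a sketch (assuming a recent JAX version where jax.random.key accepts an impl argument), the implementation can be chosen globally or per key:

import jax

# Select the alternative 'rbg' generator globally
# (the default is 'threefry2x32').
jax.config.update('jax_default_prng_impl', 'rbg')

# Or choose the implementation per key with the typed-key API.
key = jax.random.key(0, impl='rbg')
print(jax.random.uniform(key))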

 
 
 
 

9.3 Generating random numbers in real-life applications

 

9.3.1 Building a complete data augmentation pipeline
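
A sketch of a batched pipeline that combines the pieces above: each image gets its own subkey, the per-image function is vmapped over the batch, and the whole thing is jitted (the particular augmentations are the illustrative ones from earlier):

import jax
import jax.numpy as jnp

def augment(key, image):
    # One independent key per transformation.
    noise_key, flip_key = jax.random.split(key)
    image = image + 0.1 * jax.random.normal(noise_key, image.shape)
    flip = jax.random.bernoulli(flip_key)
    return jax.lax.cond(flip, jnp.fliplr, lambda img: img, image)

@jax.jit
def augment_batch(key, batch):
    # One independent key per image in the batch.
    keys = jax.random.split(key, batch.shape[0])
    return jax.vmap(augment)(keys, batch)

augmented = augment_batch(jax.random.PRNGKey(0), images[:32])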

 
 

9.3.2 Generating random initializations for a neural network
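
Weight initialization follows the same key discipline: split once per layer so no two layers share randomness. A sketch of an MLP initializer, where the layer sizes and the He-style scaled normal scheme are illustrative choices:

import jax
import jax.numpy as jnp

def init_mlp(key, sizes):
    params = []
    keys = jax.random.split(key, len(sizes) - 1)   # one key per layer
    for k, n_in, n_out in zip(keys, sizes[:-1], sizes[1:]):
        # Scaled normal weights, zero biases.
        w = jax.random.normal(k, (n_in, n_out)) * jnp.sqrt(2.0 / n_in)
        b = jnp.zeros(n_out)
        params.append((w, b))
    return params

params = init_mlp(jax.random.PRNGKey(42), [784, 512, 10])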

 
 
 
 

Summary

 
 
 