8 Bayesian neural networks


This chapter covers

  • Two approaches for fitting Bayesian neural networks (BNNs)
  • The variational inference (VI) approximation for BNNs
  • The Monte Carlo (MC) dropout approximation for BNNs
  • TensorFlow Probability (TFP) variational layers to build VI-based BNNs
  • Using Keras to implement MC dropout in BNNs

In this chapter, you learn about two efficient approximation methods that let you take a Bayesian approach to probabilistic DL models: variational inference (VI) and Monte Carlo dropout (also known as MC dropout). When setting up a Bayesian DL model, you combine Bayesian statistics with DL. (The figure at the beginning of this chapter shows a combined portrait of Reverend Thomas Bayes, the founder of Bayesian statistics, and Geoffrey Hinton, one of the godfathers of DL.) With these approximation methods, fitting Bayesian DL models with many parameters becomes feasible. As discussed in chapter 7, the Bayesian method also captures the epistemic uncertainty that non-Bayesian probabilistic DL models leave out.
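Both approximations reappear in concrete form later in the chapter (sections 8.3 and 8.4). As a preview, here is a minimal sketch of the two patterns: a VI-based network built from TFP's DenseFlipout variational layers, and a standard Keras network whose dropout is kept active at prediction time. The layer sizes and dropout rate are illustrative choices, not values from the book; the sketch assumes TensorFlow 2.x with TensorFlow Probability installed.

# A minimal sketch of both approximations (illustrative sizes and rate).
import tensorflow as tf
import tensorflow_probability as tfp

# VI approach: each weight gets a variational distribution; the layers'
# KL-divergence terms are added to the loss automatically during fitting.
vi_model = tf.keras.Sequential([
    tfp.layers.DenseFlipout(50, activation="relu"),
    tfp.layers.DenseFlipout(1),
])

# MC dropout approach: an ordinary Keras network, but dropout is forced
# to stay active at prediction time by passing training=True at call time.
inputs = tf.keras.Input(shape=(1,))
hidden = tf.keras.layers.Dense(50, activation="relu")(inputs)
hidden = tf.keras.layers.Dropout(0.2)(hidden, training=True)  # on at test time
outputs = tf.keras.layers.Dense(1)(hidden)
mc_model = tf.keras.Model(inputs, outputs)

# Epistemic uncertainty: repeat the stochastic forward pass and inspect
# the spread of the predictions, for example:
# preds = tf.stack([mc_model(x_test) for _ in range(100)])

In both cases, a single input produces a different prediction on every forward pass, and the spread of those predictions reflects the model's epistemic uncertainty.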

8.1 Bayesian neural networks (BNNs)

8.2 Variational inference (VI) as an approximative Bayes approach

8.2.1 Looking under the hood of VI*

8.2.2 Applying VI to the toy problem*

8.3 Variational inference with TensorFlow Probability

8.4 MC dropout as an approximate Bayes approach

8.4.1 Classical dropout used during training

8.4.2 MC dropout used during training and testing

8.5 Case studies

8.5.1 Regression case study on extrapolation

8.5.2 Classification case study with novel classes

Summary
