8 Bayesian neural networks
This chapter covers
- Two approaches to fitting Bayesian neural networks (BNNs)
- The variational inference (VI) approximation for BNNs
- The Monte Carlo dropout approximation for BNNs
- TensorFlow Probability (TFP) variational layers to build VI-based BNNs
- Using Keras to implement Monte Carlo dropout in BNNs
In this chapter, you'll learn about two efficient approximation methods that allow you to take a Bayesian approach to probabilistic DL models: variational inference (VI) and Monte Carlo dropout (also known as MC dropout). Setting up a Bayesian DL model means combining Bayesian statistics with DL. The figure above therefore shows Reverend Thomas Bayes, the founder of Bayesian statistics, in his preaching gown alongside Geoffrey Hinton, one of the godfathers of deep learning. With these approximation methods, fitting Bayesian DL models with many parameters becomes feasible. As discussed in chapter 7, the Bayesian approach also captures the epistemic uncertainty that non-Bayesian probabilistic DL models leave out.
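To give you a first taste of what's ahead, here is a minimal sketch of a VI-based BNN built with a TFP variational layer. The layer choice (DenseFlipout), layer sizes, and loss here are illustrative assumptions, not the chapter's actual examples; the point is that each variational layer maintains a distribution over its weights rather than point estimates.

```python
import tensorflow as tf
import tensorflow_probability as tfp

# Illustrative sketch: each variational layer holds a distribution over
# its weights and adds its KL-divergence contribution to model.losses,
# so standard Keras training optimizes the ELBO.
model = tf.keras.Sequential([
    tfp.layers.DenseFlipout(32, activation="relu"),  # illustrative size
    tfp.layers.DenseFlipout(1),
])
model.compile(optimizer="adam", loss="mse")

# Weights are resampled on every forward pass, so repeated predictions
# on the same input differ; their spread reflects epistemic uncertainty.
```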
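And here, as a preview of the second approach, is a minimal sketch of MC dropout in Keras (the architecture, dropout rate, and data are again illustrative assumptions, and the model is untrained). The only trick is to keep dropout active at prediction time by calling the model with training=True, so each forward pass samples a different subnetwork.

```python
import numpy as np
import tensorflow as tf

# An ordinary Keras model with dropout; for MC dropout, the dropout
# layer is kept active at prediction time as well.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(32, activation="relu", input_shape=(1,)),
    tf.keras.layers.Dropout(0.2),  # illustrative dropout rate
    tf.keras.layers.Dense(1),
])

x = np.linspace(-1.0, 1.0, 10).reshape(-1, 1).astype("float32")

# T stochastic forward passes with dropout switched on (training=True);
# each pass runs a different random subnetwork.
T = 50
preds = np.stack([model(x, training=True).numpy() for _ in range(T)])
mean = preds.mean(axis=0)  # predictive mean
std = preds.std(axis=0)    # spread across passes ~ epistemic uncertainty
```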