In this chapter, you learn about two efficient approximation methods that let you take a Bayesian approach with probabilistic DL models: variational inference (VI) and Monte Carlo dropout (also known as MC dropout). When setting up a Bayesian DL model, you combine Bayesian statistics with DL. (In the figure at the beginning of this chapter, you see a combined portrait of Reverend Thomas Bayes, the founder of Bayesian statistics, and Geoffrey Hinton, one of the godfathers of DL.) With these approximation methods, fitting Bayesian DL models with many parameters becomes feasible. As discussed in chapter 7, the Bayesian approach also captures the epistemic uncertainty that non-Bayesian probabilistic DL models leave out.