13 Conditional probability with Bayes’ Theorem

In this chapter

  • Conditional and joint probabilities
  • Prior and posterior probabilities
  • Likelihood functions
  • Bayes’ theorem
  • Reasoning backwards from effects to causes

In Chapter 12 we implemented types to represent and efficiently sample from categorical distributions, and we saw that we can apply any projection or filter we like to any discrete distribution. In this chapter, we're going to combine those building blocks to unlock their true power: we can use Bayes' theorem to compute posterior distributions. That is, we can update our prior beliefs based on observations. We can build statistical models of the world in which we describe the relationship between causes and effects, and then compute the probability that a particular cause produced an observed effect. This has implications for medicine, social networks, developer tools, and any number of other fields.
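To preview where this chapter is going, here is a minimal sketch of that idea in Python. The `posterior` function and the disease-test example are illustrative assumptions, not the API built in Chapter 12: a prior over causes and a likelihood of effects given each cause are combined, via Bayes' theorem, into a posterior over causes given an observed effect.

```python
def posterior(prior, likelihood, observation):
    """Given a prior P(cause) and a likelihood P(effect | cause),
    return the posterior P(cause | observation) via Bayes' theorem."""
    # Unnormalized posterior: P(cause) * P(observation | cause).
    unnormalized = {
        cause: p * likelihood(cause)(observation)
        for cause, p in prior.items()
    }
    # Normalize by the total probability of the observation.
    total = sum(unnormalized.values())
    return {cause: p / total for cause, p in unnormalized.items()}

# Illustrative example: a disease with 1% prevalence, and a test that
# is 99% sensitive (positive given disease) and 95% specific
# (negative given healthy). What is P(disease | positive)?
def likelihood(cause):
    table = {
        "disease": {"positive": 0.99, "negative": 0.01},
        "healthy": {"positive": 0.05, "negative": 0.95},
    }
    return lambda observation: table[cause][observation]

prior = {"disease": 0.01, "healthy": 0.99}
post = posterior(prior, likelihood, "positive")
# Despite the positive test, the posterior probability of disease is
# only about 1 in 6, because the disease is rare in the prior.
```

The surprising smallness of that posterior is exactly the kind of "reasoning backwards from effects to causes" this chapter develops.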

13.1 Bayes’ theorem

13.2 Likelihood functions and joint distributions

13.3 Updating priors by reasoning from effects to causes

13.4 Some applications of Bayesian reasoning

13.4.1 A few real-world examples

13.5 Unconditional likelihood functions are independent

13.6 Summary