13 Conditional probability with Bayes’ Theorem
In this chapter
- Conditional and joint probabilities
- Prior and posterior probabilities
- Likelihood functions
- Bayes’ theorem
- Reasoning backwards from effects to causes
In Chapter 12 we implemented types to represent and efficiently sample from categorical distributions, and we saw how to apply projections and filters to any discrete distribution. In this chapter, we’re going to combine those basic parts to unlock their true power: Bayes’ theorem lets us compute posterior distributions, which is to say, we can update our prior opinions based on new observations. We can build statistical models of the world that describe the relationship between causes and effects, and then reason backwards: given an observed effect, we can compute the probability that a particular cause produced it. This technique has applications in medicine, social networks, developer tools, and any number of other fields.
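To preview the idea before we build it out of the machinery from Chapter 12, here is a minimal standalone sketch of a Bayesian update over a discrete prior. The scenario and all the numbers are hypothetical (a disease with 1% prevalence, a test with a 99% true-positive rate and a 5% false-positive rate), and the `posterior` helper is an illustration, not the book’s API:

```python
def posterior(prior: dict[str, float], likelihood: dict[str, float]) -> dict[str, float]:
    """Apply Bayes' theorem: weight each cause's prior probability by the
    likelihood of the observation under that cause, then renormalize."""
    unnormalized = {cause: prior[cause] * likelihood[cause] for cause in prior}
    total = sum(unnormalized.values())
    return {cause: weight / total for cause, weight in unnormalized.items()}

# Hypothetical numbers: 1% of the population is sick.
prior = {"sick": 0.01, "healthy": 0.99}

# Likelihood of observing a positive test result, given each cause:
likelihood_positive = {"sick": 0.99, "healthy": 0.05}

post = posterior(prior, likelihood_positive)
print(post["sick"])  # roughly 0.167: even a positive test leaves the disease unlikely
```

Even with a toy example like this, the posterior is often surprising: because the prior probability of being sick is so small, a positive result from an accurate test still leaves the probability of disease well under 20%. That kind of reasoning backwards from effects to causes is exactly what the rest of this chapter develops.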