16 Markov processes and the Metropolis algorithm
In this chapter
- A Markov process generates a random sequence of states
- Random Markov texts
- Sampling posteriors with the Metropolis algorithm
The final step in this long introduction to principled techniques for using randomness in programs is to be able to sample from an arbitrary continuous distribution. In particular, we want to start with a prior distribution that models the world, condition that distribution with observations of the real world, and thereby deduce a posterior distribution that correctly updates the model. Programs that make decisions in the face of an uncertain world should use principled, mathematically sound algorithms; these tools help us do that.
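Before diving in, the prior-to-posterior update can be previewed with a tiny discrete sketch. This is my own illustration, not the book's code: we put a uniform prior on a grid of candidate values for a coin's heads probability, weight each candidate by the likelihood of an observation (here, 7 heads in 10 flips), and renormalize to get the posterior.

```python
# A minimal sketch (not the book's method) of conditioning a prior on
# data, done on a discrete grid rather than a continuous distribution.
grid = [i / 100 for i in range(101)]        # candidate values of p
prior = [1 / len(grid)] * len(grid)         # uniform prior over p

heads, flips = 7, 10
# Likelihood of the data under each candidate value of p
likelihood = [p**heads * (1 - p)**(flips - heads) for p in grid]

# Bayes' rule: posterior is proportional to prior times likelihood
unnormalized = [pr * lk for pr, lk in zip(prior, likelihood)]
total = sum(unnormalized)
posterior = [u / total for u in unnormalized]

# The posterior mean has shifted from the prior's 0.5 toward the
# observed frequency 0.7, as a correct update should.
mean = sum(p * w for p, w in zip(grid, posterior))
print(round(mean, 3))
```

A grid works only for one-dimensional toy problems; the point of the Metropolis algorithm in this chapter is to sample posteriors where enumerating candidates like this is infeasible.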
The Markov process that language models use to generate random text is, surprisingly, also the basis for the Metropolis algorithm for sampling from continuous posterior distributions. Let’s start this final fabulous adventure by looking at Markov processes in general, and then we’ll see how to use them to sample continuous distributions.
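As a taste of what's coming, here is a minimal sketch (an illustration, not the book's code) of a word-level Markov process for random text: the state is the current word, and the next word is drawn at random from the words that followed it in a training text.

```python
import random

def build_chain(text):
    """Map each word to the list of words observed to follow it."""
    words = text.split()
    chain = {}
    for cur, nxt in zip(words, words[1:]):
        chain.setdefault(cur, []).append(nxt)
    return chain

def generate(chain, start, length):
    """Walk the chain: each next word depends only on the current word."""
    word = start
    out = [word]
    for _ in range(length - 1):
        followers = chain.get(word)
        if not followers:          # dead end: no observed successor
            break
        word = random.choice(followers)
        out.append(word)
    return " ".join(out)

chain = build_chain("the cat sat on the mat and the cat ran")
print(generate(chain, "the", 8))
```

The defining property on display here is that the choice of next state depends only on the current state, not on the earlier history of the walk.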
16.1 What is a Markov process?
Let’s start with a few examples: