List of Tables


Chapter 1. Probabilistic programming in a nutshell

Table 1.1. Quantifying the probabilities in the Hello World example

Chapter 4. Probabilistic models and probabilistic programs

Table 4.1. A probability distribution, defined by assigning probabilities to possible worlds that sum to 1

Table 4.2. Our probabilistic model revisited. Forgery is much more likely than Rembrandt, so the probability of w1 is more than the other three put together. Similarly, if the painter is Rembrandt, it’s more likely to be a painting of people than a landscape, and a dark painting of people is more likely than a bright one.

Table 4.3. The model and evidence comprising your knowledge in this situation. Each piece of evidence is annotated with its relevance to the model.

Table 4.4. Posterior probability distribution after conditioning on the evidence. First, you reduce the probability of inconsistent worlds to 0. Then, you normalize by dividing each of the remaining probabilities by the sum of the probabilities of consistent worlds. This ensures that the probabilities sum to 1.

Table 4.5. Examples of variables, their types, and possible values

Table 4.6. Possible worlds for the variables Rembrandt, Brightness, and Subject. There’s a possible world for each combination of values of these variables.

Table 4.7. The form of a CPD of Size given Subject

Table 4.8. The filled-in CPD of Size, given Subject

Chapter 5. Modeling dependencies with Bayesian and Markov networks