13 Fully Bayes model parameter estimation
This chapter covers
- Fully Bayes parameter estimation for unsupervised modeling
- Injecting prior belief into parameter estimation
- Estimating Gaussian likelihood parameters with known or unknown mean and precision
- Normal-gamma and Wishart distributions
Suppose we have a data set of interest: say, all images containing a cat. If we represent images as points in a high-dimensional feature space, our data set of interest forms a subspace of that feature space. Now we want to create an unsupervised model for our data set of interest. This means we want to identify a probability density function p(x) whose sample cloud (the set of points obtained by sampling the probability distribution many times) largely overlaps our subspace of interest. Of course, we do not know the exact subspace of interest, but we have collected a set of samples X from the data set of interest: that is, the training data. We can use the point cloud for X as a surrogate for the unknown subspace of interest. Thus, we are essentially trying to identify a probability density function p(x) whose sample cloud, by and large, overlaps X.
Once we have the model p(x), we can use it to generate more data samples; these will be computer-generated cat images. This is generative modeling. Also, given a new image x, we can estimate the probability that it is an image of a cat by evaluating p(x).
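The two uses described above (generating new samples and evaluating the density at a new point) can be sketched with a toy stand-in for the image setting. The code below is an illustration only, not the chapter's method: it fits a simple Gaussian p(x) to a synthetic 2-D training cloud X (real image features would be much higher-dimensional), then samples from it and evaluates it at a nearby and a faraway point. All names and the synthetic data are assumptions for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for image feature vectors: a training cloud X of 500 points
# in a 2-D feature space, clustered around a hypothetical region of interest.
X = rng.normal(loc=[2.0, -1.0], scale=0.5, size=(500, 2))

# Fit a Gaussian density p(x) to X via the sample mean and covariance,
# so that the model's sample cloud overlaps the training cloud.
mu = X.mean(axis=0)
cov = np.cov(X, rowvar=False)
cov_inv = np.linalg.inv(cov)
norm_const = 1.0 / np.sqrt((2 * np.pi) ** 2 * np.linalg.det(cov))

def p(x):
    """Evaluate the fitted Gaussian density at a point x."""
    d = np.asarray(x) - mu
    return norm_const * np.exp(-0.5 * d @ cov_inv @ d)

# Generative use: draw fresh samples from p(x); their cloud overlaps X.
new_points = rng.multivariate_normal(mu, cov, size=5)

# Evaluation use: a point near the training cloud scores far higher
# than one far away from it.
print(p([2.0, -1.0]) > p([10.0, 10.0]))  # True
```

With real images, the same two operations apply unchanged in principle: sampling from p(x) yields new (computer-generated) images, and evaluating p(x) at a new image scores how cat-like it is, although a single Gaussian is usually too simple a density for such data.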