
A simple way to incorporate prior information on margins in Bayesian latent class models

Jerry Reiter
Duke University
Room 306, Statistics Building 1130

I present an approach to incorporating informative prior beliefs about marginal probabilities into Bayesian latent class models for categorical data. The basic idea is to append synthetic observations to the original data such that (i) the empirical distributions of the desired margins match those of the prior beliefs, and (ii) the values of the remaining variables are left missing. The degree of prior uncertainty is controlled by the number of augmented records. Posterior inferences can be obtained via typical MCMC algorithms for latent class models, tailored to deal efficiently with the missing values in the concatenated data. I describe applications of the approach including (i) how augmented records can be used to correct for stratified sampling when estimating latent class models and (ii) how augmented data can be used to relax conditional independence assumptions in data fusion.
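The augmentation idea in the abstract can be sketched in a few lines of code: build synthetic records whose empirical margin matches the prior belief, leave the other variables missing, and concatenate them to the observed data. The variable names, the binary variables, and the specific prior probability below are illustrative assumptions, not details from the talk.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)

# Illustrative original data: two binary categorical variables.
data = pd.DataFrame({
    "x1": rng.integers(0, 2, size=200),
    "x2": rng.integers(0, 2, size=200),
})

# Hypothetical prior belief about the margin of x1: P(x1 = 1) = 0.3.
# The number of augmented records controls the strength of the prior:
# more records express less prior uncertainty.
n_aug = 100
prior_p1 = 0.3

# Synthetic records whose empirical x1 margin matches the prior belief,
# with the remaining variable x2 left missing.
n_ones = round(n_aug * prior_p1)
augmented = pd.DataFrame({
    "x1": [1] * n_ones + [0] * (n_aug - n_ones),
    "x2": [pd.NA] * n_aug,
})

# Concatenated data, to be passed to an MCMC sampler for latent class
# models that handles the missing values in the augmented rows.
combined = pd.concat([data, augmented], ignore_index=True)
```

A sampler would then treat the missing `x2` entries in the augmented rows as latent, so the synthetic records inform only the `x1` margin.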
