Gaussian mixture model

In class on Wednesday I briefly described a bivariate mixture model and I "live-coded" an R implementation of a Gibbs sampler for performing posterior inference. In this post I'm just going to revisit these same points. A Gaussian mixture model has a density function that is a weighted sum of Gaussian density functions, with weights… Continue reading Gaussian mixture model
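The class example was live-coded in R; as a rough sketch of the same idea in Python (with made-up data, a two-component mixture, known unit variances, and hypothetical prior choices), a Gibbs sampler alternates between sampling component labels, component means, and the mixing weight:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data from a two-component Gaussian mixture (means -2 and 3, sd 1).
n = 200
z_true = rng.random(n) < 0.4
y = np.where(z_true, rng.normal(-2.0, 1.0, n), rng.normal(3.0, 1.0, n))

# Hypothetical priors: mu_k ~ N(0, 10^2), w ~ Beta(1, 1); sd fixed at 1.
mu = np.array([-1.0, 1.0])   # initial component means
w = 0.5                      # initial mixing weight for component 0
draws = []

for it in range(2000):
    # 1. Sample labels z_i given (mu, w): posterior probability of component 0.
    p0 = w * np.exp(-0.5 * (y - mu[0]) ** 2)
    p1 = (1 - w) * np.exp(-0.5 * (y - mu[1]) ** 2)
    z = rng.random(n) < p0 / (p0 + p1)   # True -> component 0

    # 2. Sample each mean given its assigned points (conjugate normal update).
    for k, mask in enumerate([z, ~z]):
        nk = mask.sum()
        prec = nk / 1.0 + 1.0 / 100.0        # likelihood precision + prior precision
        m = (y[mask].sum() / 1.0) / prec     # posterior mean
        mu[k] = rng.normal(m, np.sqrt(1.0 / prec))

    # 3. Sample the weight given the label counts (Beta update).
    w = rng.beta(1 + z.sum(), 1 + n - z.sum())
    draws.append((mu.copy(), w))
```

With well-separated components and a seed like this one, the chain settles near the true means quickly; in general, mixture posteriors are label-switching-prone, so don't read too much into which component is "first."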

Monte Carlo methods

In the past few classes we have taken a look at Monte Carlo methods, which are computational techniques for doing statistics instead of doing calculus. That is, instead of calculating definite integrals analytically, we sample from an appropriate probability distribution and then take sample averages. The guiding expression is just this: $latex \frac{1}{M} \sum_{m =… Continue reading Monte Carlo methods
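As a minimal sketch of that guiding expression, here is a Monte Carlo estimate of $latex E[g(\theta)]&s=1$ by averaging $latex g&s=1$ over draws from the distribution of $latex \theta&s=1$. The choices below (standard normal $latex \theta&s=1$, $latex g(\theta) = \theta^2&s=1$) are just an illustration where the exact answer is known to be 1:

```python
import numpy as np

rng = np.random.default_rng(42)

# Monte Carlo: approximate E[g(theta)] = integral of g(theta) p(theta) d(theta)
# by the sample average (1/M) * sum_m g(theta_m), with theta_m drawn from p.
# Here p = N(0, 1) and g(theta) = theta^2, so the exact answer is Var(theta) = 1.
M = 100_000
theta = rng.standard_normal(M)
estimate = (theta ** 2).mean()
```

The error of such an estimate shrinks like $latex 1/\sqrt{M}&s=1$, which is why large $latex M&s=1$ is cheap insurance.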

Bernoulli and Poisson models

Today in class we covered the Beta-Binomial model. That is, we considered a model where for $latex i = 1, \dots, n&s=1$, $latex Y_i \sim \mbox{Bernoulli}(\theta)&s=1$ independently and $latex \theta \sim \mbox{Beta}(a,b)&s=1$. For this model, the total number of successes, or $latex S = \sum_i Y_i&s=1$, is a sufficient statistic for the parameter $latex \theta&s=1$… Continue reading Bernoulli and Poisson models
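The conjugate update in that model is short enough to sketch directly. With made-up data and a uniform $latex \mbox{Beta}(1,1)&s=1$ prior (a hypothetical choice), the posterior depends on the data only through $latex S&s=1$:

```python
import numpy as np

rng = np.random.default_rng(1)

# Beta-Binomial conjugacy: with Y_i ~ Bernoulli(theta) and theta ~ Beta(a, b),
# the posterior depends on the data only through S = sum(Y_i):
#   theta | y ~ Beta(a + S, b + n - S)
a, b = 1.0, 1.0                                 # hypothetical prior hyperparameters
y = np.array([1, 0, 1, 1, 0, 1, 1, 0, 1, 1])    # made-up data
n, S = len(y), int(y.sum())

post_a, post_b = a + S, b + n - S
post_mean = post_a / (post_a + post_b)          # analytic posterior mean

# Sanity check by simulation: Monte Carlo draws from the Beta posterior.
draws = rng.beta(post_a, post_b, 50_000)
```

The simulated posterior mean matches the closed-form $latex (a+S)/(a+b+n)&s=1$, which is the point of conjugacy: the update is just bookkeeping on the hyperparameters.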

Potential outcomes

The potential outcome formalism of Donald Rubin and Jerzy Neyman is a key development in modern causal inference. One of our textbooks has a pretty good list of scholarly references. See also this blog post of Gelman's for some interesting discussion about the intellectual history of potential outcomes in economics. The assigned paper by Holland defines the fundamental problem of causal… Continue reading Potential outcomes
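A toy simulation (entirely made-up numbers) makes the formalism concrete: each unit carries two potential outcomes, but the assignment reveals only one of the pair, which is the fundamental problem; randomization is what lets a difference in observed means recover the average effect anyway.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy potential-outcomes table. Each unit i has two potential outcomes,
# Y_i(0) and Y_i(1); the unit-level effect is Y_i(1) - Y_i(0), but we
# only ever observe one member of the pair.
n = 1000
y0 = rng.normal(0.0, 1.0, n)      # outcome under control
y1 = y0 + 2.0                     # outcome under treatment (true effect = 2)

t = rng.random(n) < 0.5           # random assignment
y_obs = np.where(t, y1, y0)       # only the assigned potential outcome is seen

# Difference in observed means estimates the average treatment effect,
# even though no unit-level effect is ever directly observable.
ate_hat = y_obs[t].mean() - y_obs[~t].mean()
```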