
My question is about Bayesian inference for a partitioned multivariate Gaussian. To keep things simple, suppose there is a 2-dimensional Gaussian with marginals

$ X_1 \sim N(\mu_1, \sigma^2_1) \\ X_2 \sim N(\mu_2, \sigma^2_2) $

with covariance $\sigma_{1,2}$.

Suppose we know $\sigma_{1,2}$, $\sigma^2_1$ and $\sigma^2_2$, but we do not know $\mu_1$ and $\mu_2$; instead we have priors for them:

$ \mu_1 \sim N(\theta_1, \delta^2_1) \\ \mu_2 \sim N(\theta_2, \delta^2_2) $

Now suppose we observe $x_1$ for $X_1$. By the usual conjugate Bayesian update we get

$ \theta'_1 \mid x_1 = \frac{\delta^2_1 x_1 + \sigma^2_1 \theta_1}{\delta^2_1 + \sigma^2_1} \\ \delta'^2_1 \mid x_1 = \frac{\delta^2_1 \sigma^2_1}{\delta^2_1 + \sigma^2_1} $
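In case a numerical check is useful, here is a minimal Python sketch of this conjugate update; all numerical values are placeholders I made up for illustration:

```python
# Known quantities (illustrative placeholder values, not from the question)
sigma1_sq = 2.0               # observation variance sigma_1^2 of X_1
theta1, delta1_sq = 0.0, 1.0  # prior mean and variance of mu_1
x1 = 1.5                      # observed value of X_1

# Conjugate normal update of mu_1 given x_1 (the two formulas above)
theta1_post = (delta1_sq * x1 + sigma1_sq * theta1) / (delta1_sq + sigma1_sq)
delta1_sq_post = (delta1_sq * sigma1_sq) / (delta1_sq + sigma1_sq)

print(theta1_post)     # 0.5
print(delta1_sq_post)  # 0.666...
```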

and from the conditional distribution of the partitioned Gaussian we have

$ X_2 | X_1 \sim N \left(\mu_2 + \frac{\sigma_{1,2}}{\sigma^2_1}(x_1 - \mu_1), \sigma^2_2 - \frac{\sigma^2_{1, 2}}{\sigma^2_1} \right) $
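If it helps, here is a small Monte Carlo sketch that checks this conditional formula numerically; the numbers are placeholders, and $\mu_1$, $\mu_2$ are treated as fixed and known in this check:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative placeholder parameters
mu1, mu2 = 0.0, 1.0
sigma1_sq, sigma2_sq, sigma12 = 2.0, 3.0, 1.0
cov = np.array([[sigma1_sq, sigma12],
                [sigma12,   sigma2_sq]])

# Analytic conditional X_2 | X_1 = x_1 from the formula above
x1 = 1.5
cond_mean = mu2 + (sigma12 / sigma1_sq) * (x1 - mu1)  # 1.75
cond_var  = sigma2_sq - sigma12**2 / sigma1_sq        # 2.5

# Monte Carlo check: keep draws whose X_1 lands close to x_1
draws = rng.multivariate_normal([mu1, mu2], cov, size=1_000_000)
x2_near = draws[np.abs(draws[:, 0] - x1) < 0.05, 1]

print(cond_mean, x2_near.mean())  # should roughly agree
print(cond_var,  x2_near.var())   # should roughly agree
```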

Finally, my question: how do I update the correlated random variable using Bayesian inference, i.e. $ p(\mu_2 \mid x_1) = \frac{p(x_1 \mid \mu_2)\, p(\mu_2)}{p(x_1)} $, given that I don't know how to handle the likelihood $p(x_1 \mid \mu_2)$? Or is there some other way to obtain it? I hope this makes clear what I'm trying to do.

Thanks!

  • @MichaelHardy No. Writing $\theta'_1 \mid x_1$ simply means the updated $\theta_1$ given the observation $x_1$. $\theta_1$ is known, as you are also aware, so it has no distribution at all. Look at the right-hand side of these two equations: are there any random variables involved? (2012-03-19)
