
If I have normally distributed data, the posterior for the variance is a (scaled) inverse chi-square distribution, assuming the conjugate inverse chi-square prior is used. But what if my data have extra noise added, so that the observed sample variance is the sum of the population variance and my extra noise variance? Then the posterior for the variance is different. Is there a name for that distribution?

You can't just subtract the noise variance, because you can end up with negative values. In that respect it is similar to the Skellam distribution of the difference of two Poisson variables.

I am really interested in this from a Gibbs sampler point of view: I would like to draw the variance from its conditional posterior if possible. If that isn't easy, I can fall back on Metropolis-Hastings, I suppose.
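The Metropolis-within-Gibbs fallback can be sketched as follows. This is a minimal, hypothetical sketch, not the asker's code: it assumes the model $y_i \sim N(\mu, \sigma^2 + \tau^2)$ with $\tau^2$ known, a scaled inverse chi-square prior on $\sigma^2$ (hyperparameters `nu0`, `s0_sq` are assumptions), and performs one random-walk MH step on $\log\sigma^2$, since the full conditional for $\sigma^2$ is no longer inverse chi-square once $\tau^2$ is added.

```python
# Hypothetical Metropolis-within-Gibbs update for sigma2 in the model
# y_i ~ N(mu, sigma2 + tau2), tau2 known. Because sigma2 enters the
# likelihood only through sigma2 + tau2, the full conditional is not
# inverse chi-square, so we take a random-walk MH step on log(sigma2).
import math
import random

def log_post(sigma2, y, mu, tau2, nu0=1.0, s0_sq=1.0):
    """Unnormalized log posterior of sigma2 (likelihood + assumed prior)."""
    v = sigma2 + tau2
    n = len(y)
    ll = -0.5 * n * math.log(2 * math.pi * v) \
         - 0.5 * sum((yi - mu) ** 2 for yi in y) / v
    # scaled inverse chi-square prior, log density up to a constant
    lp = -(nu0 / 2 + 1) * math.log(sigma2) - nu0 * s0_sq / (2 * sigma2)
    return ll + lp

def mh_step_sigma2(sigma2, y, mu, tau2, step=0.5, rng=random):
    """One random-walk MH step on log(sigma2); keeps sigma2 > 0."""
    prop = sigma2 * math.exp(step * rng.gauss(0.0, 1.0))
    # acceptance ratio includes the Jacobian of the log transform:
    # log(prop) - log(sigma2)
    log_accept = (log_post(prop, y, mu, tau2)
                  - log_post(sigma2, y, mu, tau2)
                  + math.log(prop) - math.log(sigma2))
    if math.log(rng.random()) < log_accept:
        return prop
    return sigma2
```

Within a full Gibbs sweep, this step would be called once per iteration alongside the standard conjugate draws for the other parameters.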

  • If the population variance you want to estimate is $\sigma^2$ and the error variance is $\tau^2$, then the variance of the actual observation is $\sigma^2+\tau^2$ (assuming the two are uncorrelated). But if the population variance you want to estimate is $\tau^2$ and the error variance is $\sigma^2$, then the variance of the actual observation is _still_ $\sigma^2+\tau^2$. If you observe only their sum, you can't tell the difference. On the other hand, if you observe several realizations of the sum in which the errors are independent but the observations from the population you're trying to understand are the _same_, then you can do something. (2012-05-12)
  • Say you've got $X_i + \varepsilon_{i,j}$ for $i=1,\ldots,n$ and $j=1,\ldots,m_i$. Then you have a variance-components problem. Estimates by the method of moments can actually give you negative values for the variances (at least in similar problems; I'm unsure about this particular one). MLEs of course cannot. Nor can Bayesian estimates. (2012-05-12)
  • But anyway, could you clarify? (2012-05-12)

1 Answer