
Suppose that $Y_1, \ldots, Y_n \sim N(\mu, 1)$ are independent, that $Z_1, \ldots, Z_n \sim N(\mu+\beta, 1)$ are independent as well, and that the $Y_i$'s are also independent of the $Z_i$'s. Furthermore, $\mu \sim N(0, \lambda_{\mu})$ and $\beta \sim N(0, \lambda_{\beta})$.

I have two connected questions:

1) I am wondering if we have that $P(Y,Z|\mu) = P(Y|\mu)P(Z|\mu)$. I am confused here because this by itself asserts conditional independence, that is, independence given $\mu$. However, both $Y$ and $Z$ are already defined in terms of $\mu$, so aren't they conditional on $\mu$ by default?

2) What about $P(Y,Z) = P(Y)P(Z)$? Is this true? I surmise that $P(Y)$ (and similarly $P(Z)$) should be the marginal of $P(Y|\mu)$, that is:

$$ P(Y) = \int P(Y,\mu) d\mu = \int P(Y|\mu)P(\mu)d\mu $$

Is this correct? Thanks!
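As a quick numerical sanity check of that integral, here is a minimal Monte Carlo sketch. It assumes $\lambda_{\mu}$ denotes the prior *variance* (so $\mathrm{Var}(\mu) = \lambda_{\mu}$), under which the marginal works out to $Y_i \sim N(0, 1 + \lambda_{\mu})$; the numbers used are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)
lam_mu = 2.0        # assumption: lambda_mu is the prior *variance* of mu
n_draws = 200_000

# Sample from the joint: first mu ~ N(0, lam_mu), then Y | mu ~ N(mu, 1).
mu = rng.normal(0.0, np.sqrt(lam_mu), size=n_draws)
y = rng.normal(mu, 1.0)

# The integral P(Y) = int P(Y|mu) P(mu) dmu gives Y ~ N(0, 1 + lam_mu).
print(y.mean())   # ≈ 0
print(y.var())    # ≈ 1 + lam_mu = 3
```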

1 Answer

  1. Yes. The distribution $Y_1,\ldots,Y_n \sim N(\mu,1)$ you defined is a distribution conditional on $\mu$, and likewise the $Z_i$'s are conditional on $\mu$ and $\beta$ (otherwise what could it mean), so you have conditional independence.
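The contrast between the two questions can be seen in simulation. A sketch under the model above, again assuming $\lambda_{\mu}$ and $\lambda_{\beta}$ are prior variances (values arbitrary): with $\mu$ and $\beta$ held fixed, $Y$ and $Z$ have essentially zero covariance, but when $\mu$ is random and shared, $\mathrm{Cov}(Y_i, Z_j) = \mathrm{Var}(\mu) = \lambda_{\mu} > 0$:

```python
import numpy as np

rng = np.random.default_rng(1)
lam_mu, lam_beta = 2.0, 1.0   # assumption: prior *variances* for mu, beta
n = 200_000

# Conditional on fixed mu and beta, Y and Z are independent.
mu_fixed, beta_fixed = 0.7, -0.3
y_c = rng.normal(mu_fixed, 1.0, size=n)
z_c = rng.normal(mu_fixed + beta_fixed, 1.0, size=n)
print(np.cov(y_c, z_c)[0, 1])    # ≈ 0

# Unconditionally, the shared random mu induces covariance Var(mu) = lam_mu.
mu = rng.normal(0.0, np.sqrt(lam_mu), size=n)
beta = rng.normal(0.0, np.sqrt(lam_beta), size=n)
y = rng.normal(mu, 1.0)
z = rng.normal(mu + beta, 1.0)
print(np.cov(y, z)[0, 1])        # ≈ lam_mu = 2
```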

  2. No. A common counterexample comes from the Bayesian statistics of coin flipping. Say you have a coin with unknown probability of heads $p$, and that $p$ is drawn from a distribution $P(p)$. In this example, like yours, the coinflip values $X_1,\ldots,X_n$ are conditionally independent given $p$, where $p$ is the unknown probability of heads. Are they unconditionally independent, though? It turns out the answer is no. The easy way to see this (other than playing with the formulae) is to think about what happens as you flip. Say a large majority of the first several flips are heads, and assume the process that generated $p$ satisfies $P(p > 1/2) > 0$. Then the most likely explanation is that we drew a $p > 1/2$, and the large fraction of heads is a reflection of this, so we will probably see a high fraction of heads in the future as well. The same goes if there are a lot of tails. Thus the coinflip values $X_1, X_2, \ldots$ are (unconditionally) positively correlated and hence dependent.
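The argument above can be sketched numerically. Assuming, for concreteness, the prior $p \sim \mathrm{Uniform}(0,1)$ (not specified in the original), two flips of the same coin have $\mathrm{Cov}(X_1, X_2) = \mathrm{Var}(p) = 1/12$, so they are positively correlated and hence unconditionally dependent:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500_000

# Assumed prior on the coin's heads-probability: p ~ Uniform(0, 1).
p = rng.uniform(0.0, 1.0, size=n)

# Two flips of the same coin: conditionally independent given p.
x1 = (rng.uniform(size=n) < p).astype(float)
x2 = (rng.uniform(size=n) < p).astype(float)

# Unconditionally, Cov(X1, X2) = Var(p) = 1/12 ≈ 0.083.
print(np.cov(x1, x2)[0, 1])
```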