
Suppose I have three random variables $X, Y, Z$, where $X$ is independent of $Y$ and of $Z$. I want to conclude that $X$ is independent of $(Y,Z)$.

Here $(Y,Z)$ is understood via the generated $\sigma$-algebra: $\sigma(Y,Z) := \sigma\big( \{ Y^{-1}(A) \mid A\in \mathcal{E}_1 \} \cup \{Z^{-1}(B) \mid B\in \mathcal{E}_2 \} \big)$, where $\mathcal{E}_1$ and $\mathcal{E}_2$ are the target $\sigma$-algebras of $Y$ and $Z$.

(For example, to establish $E(X)E(Y1_A)=E(XY1_A)$.)

Is there an easy proof or a reference? (The sources I know usually deal with only two random variables, and it would be nice to understand the details of independence from more than one random variable.)

EDIT: I am sorry, I forgot to write that I also want to assume that $Y$ and $Z$ are independent. I think this also rules out the counter-examples written so far.

  • 0
    @André So I was badly wrong. The counter-intuitive thing for me is that in the case where I want to apply it, $Z$ is a pseudo-randomly generated number and $Y$ is the current state of a physical process, so the notion of a common $\sigma$-algebra $(Y,Z)$ seems quite weird. But because the counter-examples here are so general, I guess I will have to include such a condition. Thanks for all the answers. (2011-08-08)

3 Answers

1

Let $X$ be the mod-2 sum of $Y$ and $Z$: let $Y$ and $Z$ each be $0$ with probability $1/2$ and $1$ with probability $1/2$, and suppose $Y$ and $Z$ are independent of each other. Obviously $X$ is not independent of the pair $(Y,Z)$, since $(Y,Z)$ determines $X$ and $X$ is not constant. But you can see that $X$ is independent of $Y$ by computing the probability that $(X,Y)$ equals each of the four pairs $(0,0), (0,1), (1,0), (1,1)$, and in the same way you can see that $X$ is independent of $Z$.
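
If it helps to see the bookkeeping spelled out, here is a minimal sketch (plain Python, not part of the original answer; the names are mine) that enumerates the four equally likely outcomes of $(Y,Z)$ and checks both claims, pairwise independence and the failure of joint independence:

```python
from itertools import product
from fractions import Fraction

# Four equally likely outcomes for (Y, Z); X is their mod-2 sum (XOR).
outcomes = list(product([0, 1], repeat=2))
p = Fraction(1, 4)  # probability of each outcome

def prob(event):
    """Probability of an event, given as a predicate on (y, z)."""
    return sum(p for (y, z) in outcomes if event(y, z))

# X is independent of Y, and of Z: each joint law factors.
for x, v in product([0, 1], repeat=2):
    assert prob(lambda y, z: (y ^ z) == x and y == v) == (
        prob(lambda y, z: (y ^ z) == x) * prob(lambda y, z: y == v))
    assert prob(lambda y, z: (y ^ z) == x and z == v) == (
        prob(lambda y, z: (y ^ z) == x) * prob(lambda y, z: z == v))

# ...but X is NOT independent of the pair (Y, Z):
# P(X=0, (Y,Z)=(0,0)) = 1/4, while P(X=0) * P((Y,Z)=(0,0)) = 1/2 * 1/4 = 1/8.
print(prob(lambda y, z: (y ^ z) == 0 and (y, z) == (0, 0)),
      prob(lambda y, z: (y ^ z) == 0) * prob(lambda y, z: (y, z) == (0, 0)))
```

The asserted factorizations all hold exactly, while the last line prints $1/4$ versus $1/8$: the joint law of $X$ and $(Y,Z)$ does not factor.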

1

$X$ need not be independent of $(Y,Z)$. For example, let $Y$ and $Z$ be independent of each other and both be uniformly distributed on $\{-1,1\}$, and let $X=YZ$.
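
Filling in the arithmetic behind this example: $P(X=1)=P(Y=Z)=\tfrac12$, and for any $a,b\in\{-1,1\}$,
$$P(X=a,\,Y=b)=P(Z=ab,\,Y=b)=\tfrac14=P(X=a)\,P(Y=b),$$
and likewise with $Z$ in place of $Y$, so $X$ is independent of each of $Y$ and $Z$. On the other hand,
$$P\big(X=1 \mid (Y,Z)=(1,1)\big)=1\neq\tfrac12=P(X=1),$$
so $X$ is not independent of $(Y,Z)$.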

1

The following is a counterexample.

Choose at random one of the following $9$ three-digit numbers, with all numbers equally likely: $$123 \quad 132 \quad 213 \quad 231\quad 312 \quad 321\quad 111 \quad 222\quad 333.$$ Let $X$ be the first digit of the chosen number, $Y$ the second digit, and $Z$ the third digit. Then any two of our random variables are independent. But if we know $(Y,Z)$, we know $X$, so $X$ and $(Y,Z)$ are not independent. For completeness, verification of the details is given below.

Counting shows that $P(X=1)=P(X=2)=P(X=3)=1/3$. The analogous assertions are true for $Y$ and $Z$.

By symmetry, to verify pairwise independence it is enough to deal with $X$ and $Y$, and by symmetry between the digit values it is enough to show that $P(X=1\mid Y=i)=1/3$ for each $i$. For $i=1$: in exactly $1$ of the $3$ cases where $Y=1$ we have $X=1$, so $P(X=1\mid Y=1)=1/3$. The same count works for $i=2$ and $i=3$.

The fact that $X$ and $(Y,Z)$ are not independent follows, for example, from the fact that the probability that $X=1$, given that $(Y,Z)=(1,1)$, is $1$.
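
For the skeptical reader, the counting above can also be checked mechanically; the sketch below (plain Python, only a sanity check, not part of the original argument) enumerates the nine numbers and confirms both pairwise independence and the failure of joint independence:

```python
from fractions import Fraction
from itertools import product

# The nine equally likely three-digit numbers from the answer above.
numbers = ["123", "132", "213", "231", "312", "321", "111", "222", "333"]
p = Fraction(1, 9)

def prob(event):
    """Probability of an event, given as a predicate on the digit triple (x, y, z)."""
    return sum(p for n in numbers if event(int(n[0]), int(n[1]), int(n[2])))

# Pairwise independence: the joint law of any two of the digits factors.
for a, b in product([1, 2, 3], repeat=2):
    for i, j in [(0, 1), (0, 2), (1, 2)]:  # the pairs (X,Y), (X,Z), (Y,Z)
        joint = prob(lambda *t: t[i] == a and t[j] == b)
        assert joint == prob(lambda *t: t[i] == a) * prob(lambda *t: t[j] == b)

# ...but (Y,Z) determines X: P(X=1, (Y,Z)=(1,1)) equals P((Y,Z)=(1,1)),
# so P(X=1 | (Y,Z)=(1,1)) = 1, while P(X=1) = 1/3.
print(prob(lambda x, y, z: x == 1 and (y, z) == (1, 1)),
      prob(lambda x, y, z: (y, z) == (1, 1)))
```

Both printed probabilities are $1/9$, so $P(X=1\mid (Y,Z)=(1,1))=1$, whereas $P(X=1)=1/3$.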

  • 0
    It is a standard exercise in a probability course, following the definition of "independent", that you can have three random variables where each pair is independent but the triple is not. (2011-08-05)