
Is it true that if two indicator variables are independent then they are uncorrelated?

I know that covariance $=0$ $\Rightarrow$ uncorrelated.

Does the covariance between indicator variables exist?

Thanks.

  • @Michael: Thanks for the clarification. Corium: you may want to restate your question in a cleaner and clearer way, please. (2011-12-04)

1 Answer


It's true. In fact, for two indicator variables, uncorrelated is equivalent to independent; that equivalence fails for more than two.

The proof is simple. Suppose $ X=\begin{cases} 1 & \text{with probability }p, \\ 0 & \text{with probability } 1-p, \end{cases} \qquad Y=\begin{cases} 1 & \text{with probability }q, \\ 0 & \text{with probability } 1-q. \end{cases} $ They are independent if and only if $\Pr(X=1\ \&\ Y=1)=pq$ (once that one cell factors, the other three do too, e.g. $\Pr(X=1\ \&\ Y=0)=p-pq=p(1-q)$). The covariance is $ E(XY) - E(X)E(Y) = E(XY) - pq. $ Notice that $E(XY) = 0\cdot\Pr(XY=0) + 1\cdot\Pr(XY=1) = \Pr(XY=1)$, and that $XY=1$ if and only if $X=1$ and $Y=1$. So the covariance is $0$ if and only if $\Pr(X=1\ \&\ Y=1)=pq$, that is, if and only if $X$ and $Y$ are independent.
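
A quick numerical sanity check of this equivalence (just a sketch, not part of the original argument): enumerate a joint distribution of two indicators on $\{0,1\}^2$, compute the covariance from the cell probabilities, and test whether every cell factors into the product of the marginals. The helper name `check` is mine.

```python
# Sketch: for two indicators, covariance 0 coincides with independence.
import itertools

def check(joint):
    """joint[(x, y)] = Pr(X=x, Y=y); return (covariance, independent?)."""
    p = joint[(1, 0)] + joint[(1, 1)]   # Pr(X = 1)
    q = joint[(0, 1)] + joint[(1, 1)]   # Pr(Y = 1)
    cov = joint[(1, 1)] - p * q         # E[XY] - E[X]E[Y], since E[XY] = Pr(X=1, Y=1)
    independent = all(
        abs(joint[(x, y)] - (p if x else 1 - p) * (q if y else 1 - q)) < 1e-12
        for x, y in itertools.product((0, 1), repeat=2)
    )
    return cov, independent

# Dependent joint distribution: nonzero covariance, not independent.
print(check({(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}))
# Product of marginals (p = q = 1/2): zero covariance, independent.
print(check({(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}))
```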

The simplest instance of the fact that this fails for more than two random variables: take $p=q=1/2$ and let $Z$ be the mod-$2$ sum of $X$ and $Y$. Then $X$, $Y$, and $Z$ are pairwise independent (hence pairwise uncorrelated), but clearly not mutually independent. Hence uncorrelated but not independent.
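
Here is a small enumeration of that mod-$2$ counterexample (again a sketch, not from the original answer): with $X$, $Y$ independent fair bits and $Z = X \oplus Y$, every pairwise covariance is $0$, yet mutual independence would force $\Pr(X=Y=Z=1)=1/8$, which fails.

```python
# Sketch: X, Y fair independent bits, Z = X XOR Y (the mod-2 sum).
import itertools

outcomes = [(x, y, x ^ y) for x, y in itertools.product((0, 1), repeat=2)]
prob = 1 / len(outcomes)   # each of the four (X, Y) pairs has probability 1/4

def cov(i, j):
    """Covariance of coordinates i and j of the triple (X, Y, Z)."""
    e_i = sum(o[i] for o in outcomes) * prob
    e_j = sum(o[j] for o in outcomes) * prob
    e_ij = sum(o[i] * o[j] for o in outcomes) * prob
    return e_ij - e_i * e_j

# All three pairwise covariances are 0.0: pairwise uncorrelated.
print([cov(i, j) for i, j in itertools.combinations(range(3), 2)])
# But Pr(X = Y = Z = 1) is 0, not 1/8, so the triple is not independent.
print(sum(prob for o in outcomes if o == (1, 1, 1)))
```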

  • Oh. I missed that because the converse is trivial, and what I proved applies to indicator (or "Bernoulli") random variables but not more generally. I can't help but suspect this is what he intended to ask. Unless he was simply wondering whether covariances of such random variables exist, which is also trivial. (2011-12-04)