
If we say $X$ has a uniform distribution on $\{-1,0,1\}$ and let $Y=X^2$, are $X$ and $Y$ uncorrelated and are they independent? I would say that they are not independent since $Y$ clearly depends on $X$, but a friend told me that that's not correct. How would I show that they are dependent? (Or maybe he is correct?)

Also, I said that they were correlated because $Y$ changes as $X$ changes; that means they're correlated, right? I'm just feeling doubtful now. Some help, please?


3 Answers


Consider for example $\mathbb{P}[X=-1,Y=1]$:

$\mathbb{P}[X=-1,Y=1] = \mathbb{P}[X=-1] = \frac{1}{3}$

using that $Y=X^2$, but on the other hand

$\mathbb{P}[X=-1] \cdot \mathbb{P}[Y=1] = \frac{1}{3} \cdot \frac{2}{3} = \frac{2}{9} \not= \frac{1}{3}$

This means that $X$, $Y$ cannot be independent.
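As a quick sanity check (not part of the original answer), one can enumerate the joint distribution of $(X, Y)$ exactly in Python and confirm that the joint probability does not factor:

```python
from fractions import Fraction

# Exact joint law of (X, Y) with X uniform on {-1, 0, 1} and Y = X^2.
p = Fraction(1, 3)
joint = {}
for x in (-1, 0, 1):
    y = x * x
    joint[(x, y)] = joint.get((x, y), 0) + p

p_x_neg1 = sum(v for (x, y), v in joint.items() if x == -1)  # P[X = -1]
p_y_1 = sum(v for (x, y), v in joint.items() if y == 1)      # P[Y = 1]

print(joint[(-1, 1)])     # P[X = -1, Y = 1] = 1/3
print(p_x_neg1 * p_y_1)   # 1/3 * 2/3 = 2/9, not equal, so not independent
```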

Concerning correlation: Obviously $\mathbb{E}X = 0$ and

$\mathbb{E}(X \cdot Y) = \mathbb{E}(X^3) = \frac{1}{3} \cdot (-1)^3+ \frac{1}{3} \cdot 0 + \frac{1}{3} \cdot 1^3 = 0 = \mathbb{E}X \cdot \mathbb{E}Y$

... so by definition $X$ and $Y$ are uncorrelated.
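The covariance computation can likewise be checked by exact enumeration (a small sketch, not from the thread), using the three equally likely $(x, y)$ pairs:

```python
from fractions import Fraction

# Exact covariance of X and Y = X^2 for X uniform on {-1, 0, 1}.
support = [(-1, 1), (0, 0), (1, 1)]       # (x, y) pairs, each with probability 1/3
p = Fraction(1, 3)

EX  = sum(p * x for x, _ in support)      # E[X]  = 0
EY  = sum(p * y for _, y in support)      # E[Y]  = 2/3
EXY = sum(p * x * y for x, y in support)  # E[XY] = E[X^3] = 0

cov = EXY - EX * EY
print(cov)  # 0, so X and Y are uncorrelated
```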

  • In general you can show the following: if $X$ is a symmetric random variable, i.e. $P(X>a) = P(X<-a)$ for all $a \in \mathbb{R}$, with $E|X|^3 < \infty$, then $X$ and $X^2$ are uncorrelated but (unless $X^2$ is almost surely constant) not independent. (2016-12-13)
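To illustrate the comment's claim on one concrete symmetric example (my own choice, not from the thread), take $X$ uniform on $\{-2,-1,0,1,2\}$, which is symmetric about $0$:

```python
from fractions import Fraction

# One symmetric discrete example: X uniform on {-2, -1, 0, 1, 2}.
support = (-2, -1, 0, 1, 2)
p = Fraction(1, 5)

EX  = sum(p * x for x in support)       # 0 by symmetry
EX2 = sum(p * x**2 for x in support)    # E[X^2]
EX3 = sum(p * x**3 for x in support)    # E[X * X^2] = 0 by symmetry

cov = EX3 - EX * EX2
print(cov)  # 0: X and X^2 are uncorrelated, though clearly dependent
```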

Your intuition is right that when one variable is a function of the other, they are usually not independent (although they may be uncorrelated). In this case that rule of thumb is correct, and others have given the computations demonstrating the dependence.

However, remember this rule of thumb is not always right. For instance, let $X$ and $Z$ be independent RVs, having value $1$ with probability $1/2$ and $-1$ with probability $1/2.$ Then let $Y = XZ.$ $Y$ depends on $X$ in a functional sense, but it is actually independent of $X$.

However, if you replace $X$ in my example with a standard normal variable (keeping $Z = \pm 1$ as before) and set $Y = XZ,$ then $X$ and $Y$ are not independent (although they are still uncorrelated).

The key is that dependence means knowing one variable tells you something about the other. In the example with the normal variable, knowing $X = 1$ tells you that $Y = \pm 1,$ which is new information about $Y$ since it could have been any real number. In contrast, in the example where $X$ and $Z$ are $\pm 1,$ learning $X = 1$ gives you no probabilistic information about $Y.$ It's still $\pm 1$ with 50-50 probability, just like before you knew $X.$
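The $\pm 1$ example above can be verified by brute force (a sketch of mine, not part of the original answer): enumerate all four equally likely outcomes of $(X, Z)$ and check that the joint law of $(X, Y)$ factors into its marginals.

```python
from fractions import Fraction
from itertools import product

# Joint law of (X, Y) with X, Z independent and uniform on {-1, 1}, Y = X*Z.
p = Fraction(1, 4)
joint = {}
for x, z in product((-1, 1), repeat=2):
    joint[(x, x * z)] = joint.get((x, x * z), 0) + p

marg_x = {x: sum(v for (a, _), v in joint.items() if a == x) for x in (-1, 1)}
marg_y = {y: sum(v for (_, b), v in joint.items() if b == y) for y in (-1, 1)}

# Independence: the joint probability factors for every (x, y) pair.
independent = all(joint[(x, y)] == marg_x[x] * marg_y[y]
                  for x, y in product((-1, 1), repeat=2))
print(independent)  # True
```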


$X$ takes the values $-1, 0, 1$ each with probability $1/3$; hence $Y = X^2$ takes the value $0$ with probability $1/3$ and $1$ with probability $2/3$, and $XY = X^3 = X$ takes $-1, 0, 1$ each with probability $1/3$. Therefore \begin{align} E(X) &= 0, & E(Y) &= 2/3, & E(XY) &= 0 = E(X)\,E(Y). \end{align} Uncorrelated!

Also, \begin{align} p_{Y\mid X}(1\mid X=x)&=\begin{cases} 1&x=-1,1\\0&x=0 \end{cases}\\ &\neq p_Y(1)=2/3, \end{align} so $X$ and $Y$ are not independent.