22

Does anyone have an example of two dependent random variables that satisfy this relation?

$E[f(X)f(Y)]=E[f(X)]E[f(Y)]$ for every function $f(t)$.

Thanks.

*edit: I still couldn't find an example. I think such an example should consist of two identically distributed variables, since all the "moments" need to factor: $E[X^iY^i]=E[X^i]E[Y^i]$. That's plumb hard...

  • 0
    So now I need to prove that there exist two variables $X,Y$ that satisfy the original equation but are not independent. I can't really find any example...2012-11-29

5 Answers

17

Here is a counterexample. Let $V$ be the set $\lbrace 1,2,3 \rbrace$. Consider random variables $X$ and $Y$ with values in $V$, whose joint distribution is defined by the following matrix:

$ P=\left( \begin{matrix} \frac{1}{10} & 0 & \frac{1}{5} \\ \frac{1}{5} & \frac{1}{10} & 0 \\ \frac{1}{30} & \frac{7}{30} & \frac{2}{15} \end{matrix} \right)= \left( \begin{matrix} \frac{3}{30} & 0 & \frac{6}{30} \\ \frac{6}{30} & \frac{3}{30} & 0 \\ \frac{1}{30} & \frac{7}{30} & \frac{4}{30} \end{matrix}\right) $

Thus, for example, $P(X=1,Y=2)=0$ while $P(X=1)P(Y=2)=(\frac{1}{10} + \frac{1}{5})(\frac{1}{10} + \frac{7}{30}) >0$. So $X$ and $Y$ are not independent.

Let $f$ be an ARBITRARY (I emphasize this point because of a comment below) function defined on $V$; put $x=f(1)$, $y=f(2)$, $z=f(3)$. Then

$ \begin{array}{rcl} {\mathbf E}(f(X)) &=& \frac{3(x+y)+4z}{10} \\ {\mathbf E}(f(Y)) &=& \frac{x+y+z}{3} \\ {\mathbf E}(f(X)){\mathbf E}(f(Y)) &=& \frac{3(x+y)^2+7(x+y)z+4z^2}{30} \\ {\mathbf E}(f(X)f(Y)) &=& \frac{3x^2+6xy+3y^2+7xz+7yz+4z^2}{30} \\ \end{array} $

The last two expressions are equal, QED.
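
As a sanity check, here is a minimal symbolic sketch in Python (not part of the original argument; it assumes `sympy` is installed). The names $x,y,z$ play the same roles as above:

```python
# Symbolic check of the 3x3 example above (sketch; assumes sympy).
from sympy import Matrix, Rational, expand, symbols

x, y, z = symbols('x y z')  # x = f(1), y = f(2), z = f(3)
f = [x, y, z]

# Joint distribution P(X=i, Y=j): rows indexed by X, columns by Y.
P = Matrix(3, 3, [Rational(n, 30) for n in (3, 0, 6,
                                            6, 3, 0,
                                            1, 7, 4)])

E_fX = sum(sum(P.row(i)) * f[i] for i in range(3))
E_fY = sum(sum(P.col(j)) * f[j] for j in range(3))
E_fXfY = sum(P[i, j] * f[i] * f[j] for i in range(3) for j in range(3))

# Expands to 0: E[f(X)f(Y)] = E[f(X)]E[f(Y)] for every choice of x, y, z.
print(expand(E_fXfY - E_fX * E_fY))
```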

  • 0
    @EwanDelanoy: No, simply that this _is_ an example of what the question is looking for. Perhaps you're answering the "edit:" which suggests "two identically distributed variables" are required.2012-12-04
6

Here is a continuous counterexample. It has discrete analogues, some of which are given at the end. We start with a general remark, which may help to understand where the counterexamples come from. (For a short version of the answer, go to the picture and ponder it; it really says everything.)

Assume that $(X,Y)$ has PDF $p$ and that there exists some measurable function $q$ such that, for every $(x,y)$, $ p(x,y)+p(y,x)=2q(x)q(y). $ One can assume without loss of generality that $q$ is a PDF. Then, for every function $f$, $ E[f(X)f(Y)]=\iint f(x)f(y)p(x,y)\mathrm dx\mathrm dy=\iint f(x)f(y)p(y,x)\mathrm dx\mathrm dy, $ where the second equality follows from exchanging the roles of $x$ and $y$ in the integral. Averaging these two integrals and using the hypothesis on $p$ yields $ E[f(X)f(Y)]=\iint f(x)f(y)q(x)q(y)\mathrm dx\mathrm dy=\left(\int f(x)q(x)\mathrm dx\right)^2. $ Thus, if, furthermore, $ q(x)=\int p(x,y)\mathrm dy=\int p(y,x)\mathrm dy, $ then indeed, $ E[f(X)f(Y)]=E[f(X)]E[f(Y)]. $

Now, a specific counterexample: assume that $(X,Y)$ is uniformly distributed on the set $ D=\{(x,y)\in[0,1]^2\,;\,\{y-x\}\leqslant\tfrac12\}, $ where $\{\ \}$ denotes the fractional-part function. In words, $D$ (the black part of the square $[0,1]^2$ in the image below) is the union of the part of the square $[0,1]^2$ between the lines $y=x$ and $y=x+\frac12$ and the part below the line $y=x-\frac12$.

(Figure: the square $[0,1]^2$, with $D$ shown in black and its complement in white.)

Then $(Y,X)$ is uniformly distributed on the image of $D$ by the symmetry $(x,y)\to(y,x)$ (the white part in the image of the square $[0,1]^2$ above), which happens to be the complement of $D$ in the square $[0,1]^2$ (actually, modulo some lines, which have zero Lebesgue measure). Thus, our first identity holds with $q=\mathbf 1_{[0,1]}$, that is:

If $(X,Y)$ is uniformly distributed on $D$, then $X$ and $Y$ are both uniformly distributed on $[0,1]$, $X$ and $Y$ are not independent, and, for every function $f$, $E[f(X)f(Y)]=E[f(X)]E[f(Y)]$.

Note that $(X,Y)$ can be constructed as follows. Let $U$ and $V$ be i.i.d. uniform on $[0,1]$; then $(X,Y)=(U,V)$ if $(U,V)$ is in $D$, else $(X,Y)=(V,U)$.
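
Here is a quick Monte Carlo sanity check of this construction (a sketch, not part of the original answer; it assumes `numpy`, and the test functions are arbitrary choices):

```python
# Monte Carlo sanity check of the uniform-on-D construction
# (sketch; assumes numpy; the test functions below are arbitrary).
import numpy as np

rng = np.random.default_rng(0)
n = 10**6
u, v = rng.random(n), rng.random(n)

# (X, Y) = (U, V) if {V - U} <= 1/2, else (V, U); this makes (X, Y)
# uniform on D while X and Y each stay uniform on [0, 1].
swap = (v - u) % 1.0 > 0.5
x = np.where(swap, v, u)
y = np.where(swap, u, v)

for f in (np.square, np.exp, lambda s: np.sin(7 * s)):
    lhs = (f(x) * f(y)).mean()
    rhs = f(x).mean() * f(y).mean()
    print(f"{lhs:.4f}  vs  {rhs:.4f}")  # agree up to Monte Carlo error
```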

An analogue of this in the discrete setting is to consider $(X,Y)$ with joint distribution on the set $\{a,b,c\}^2$ described by the matrix $ \frac19\begin{pmatrix}1&2&0\\0&1&2\\2&0&1\end{pmatrix}. $ Then the random set $\{X,Y\}$ is distributed like $\{U,V\}$, where $U$ and $V$ are i.i.d. uniform on $\{a,b,c\}$. For example, considering $S=\{(a,b),(b,a)\}$, one sees that $ P[(X,Y)\in S]=\tfrac29=P[(U,V)\in S], $ since $[(X,Y)\in S]=[(X,Y)=(a,b)]$, while $ P[(U,V)\in S]=P[(U,V)=(a,b)]+P[(U,V)=(b,a)]=\tfrac19+\tfrac19. $ Thus, $ E[f(X)f(Y)]=E[f(U)f(V)]=E[f(U)]E[f(V)]=E[f(X)]E[f(Y)]. $

This example can be extended to any sample space of odd size. A more general distribution on $\{a,b,c\}^2$ is, for every $|t|\leqslant1$, $ \frac19\begin{pmatrix}1&1+t&1-t\\1-t&1&1+t\\1+t&1-t&1\end{pmatrix}. $
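
A minimal symbolic sketch for the whole parameterized family (again assuming `sympy`, and not part of the original answer); the difference of the two sides expands to zero identically in $t$:

```python
# Symbolic check of the parameterized 3x3 family (sketch; assumes sympy):
# the identity holds identically in t, hence for every |t| <= 1.
from sympy import Matrix, expand, symbols

x, y, z, t = symbols('x y z t')  # x = f(a), y = f(b), z = f(c)
f = [x, y, z]
P = Matrix([[1, 1 + t, 1 - t],
            [1 - t, 1, 1 + t],
            [1 + t, 1 - t, 1]]) / 9

E_fX = sum(sum(P.row(i)) * f[i] for i in range(3))
E_fY = sum(sum(P.col(j)) * f[j] for j in range(3))
E_fXfY = sum(P[i, j] * f[i] * f[j] for i in range(3) for j in range(3))

print(expand(E_fXfY - E_fX * E_fY))  # -> 0, identically in t
```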

-1

The relation $E[XY]=E[X]E[Y]$ holds if and only if the covariance of $X$ and $Y$ is zero, that is, $X$ and $Y$ are uncorrelated.

So you are actually asking whether there are dependent but uncorrelated random variables $X$ and $Y$ such that every function $f$ preserves the uncorrelatedness.

Consider a function $f(x) = a$, where $a$ is some constant. Now, $f(X) = a$ and $f(Y) = a$, and therefore $f(X)$ and $f(Y)$ are correlated. Thus, the answer to your question is that there are no such variables.

  • 1
    *therefore f(X) and f(Y) are correlated*... The opposite holds: these are **UN-** correlated. (Upvoters: why the upvote?)2013-07-17
-2

If you take dependent random variables $X$ and $Y$, and set $X' = X - E[X]$ and $Y' = Y - E[Y]$, then $E[f(X')f(Y')]=E[f(X')]E[f(Y')]=0$ as long as $f$ preserves the zero expected value. I guess you cannot show this for all $f$.

-5

If $X$ and $Y$ are dependent, then this equality cannot hold for every function $f$. You may find some $f$ for which it is true, but not for all $f$. This is by definition of independent random variables.

  • 3
    How is this by definition? There's a definition which says that if this holds for every $f,g$ (replacing one $f$ with $g$ in the original equation), then they're independent. Here it's a weaker condition.2012-11-26