
Suppose I have three uncorrelated random variables $X, Y$ and $Z$ (discrete or continuous) such that

$$\newcommand{\Cov}{\mathrm{Cov}}\Cov(X,Y)=0;\quad \Cov(Y,Z)=0;\quad \Cov(X,Z)=0 \tag{$\ast$}$$

I am to prove or disprove the fact that

$$E(XYZ) = E(X)\cdot E(Y)\cdot E(Z)$$

It is evening here already and my head is somewhat slow; the obvious first step I took was

$$\Cov(XY, Z) = E(XYZ) - E(XY)\cdot E(Z)$$

Since $X$ and $Y$ are uncorrelated, $E(XY) = E(X)\cdot E(Y)$, so

$$\Cov(XY, Z) = E(XYZ) - E(X)\cdot E(Y)\cdot E(Z)$$

If I could prove or disprove that $(\ast)$ implies $\Cov(XY,Z)=0$, then I would succeed.

Can you please help me with that?

Thank you in advance!

  • 0
    See http://en.wikipedia.org/wiki/Pairwise_independence – 2011-12-11

1 Answer


Suppose $X$ and $Y$ are independent and identically distributed, with $$ X = \begin{cases} 1 & \text{with probability }1/2, \\ 0 & \text{with probability }1/2, \end{cases} $$ and let $Z$ be the mod-$2$ sum of $X$ and $Y$. Find $E(X)$, $E(Y)$, $E(Z)$ and all three covariances. Then find $E(XYZ)$.

(This is the same example I put into this initial edit of a Wikipedia article in 2004, and it's still prominent in the current version. But in that article it served a slightly different purpose.)
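To check the hinted counterexample concretely, here is a small exact enumeration (a sketch; the helper name `E` and the variable names are mine, not from the answer). It encodes $X, Y$ uniform on $\{0,1\}$ and $Z = (X + Y) \bmod 2$, and computes all the requested quantities over the four equally likely outcomes.

```python
from itertools import product

# X, Y i.i.d. uniform on {0, 1}; Z is their mod-2 sum.
# Each (x, y) pair has probability 1/4.
outcomes = [(x, y, (x + y) % 2) for x, y in product([0, 1], repeat=2)]
p = 1 / len(outcomes)

def E(f):
    """Expectation of f(x, y, z) under the uniform distribution on outcomes."""
    return sum(p * f(x, y, z) for x, y, z in outcomes)

EX = E(lambda x, y, z: x)  # 1/2
EY = E(lambda x, y, z: y)  # 1/2
EZ = E(lambda x, y, z: z)  # 1/2

# All three pairwise covariances vanish:
cov_xy = E(lambda x, y, z: x * y) - EX * EY
cov_yz = E(lambda x, y, z: y * z) - EY * EZ
cov_xz = E(lambda x, y, z: x * z) - EX * EZ

# But XYZ = 1 would require x = y = 1 and z = 0 + ... impossible here,
# since x = y = 1 forces z = 0; so E(XYZ) = 0 while E(X)E(Y)E(Z) = 1/8.
E_xyz = E(lambda x, y, z: x * y * z)
```

So $(\ast)$ holds for this triple, yet $E(XYZ) = 0 \ne 1/8 = E(X)\cdot E(Y)\cdot E(Z)$, disproving the claim.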

  • 0
    Thank you for the hint! How do I prove that $E(XZ) - E(X)\cdot E(Z) = 0$? – 2011-12-11
  • 0
    Great -- I love it when that happens (in Wikipedia). Let me use the occasion to thank you for all the work you've put into math at Wikipedia -- whenever I delved into something there and looked at the edit history, your name more often than not turned up at some point :-) – 2011-12-11
  • 0
    @wh1t3cat1k Just find the joint distribution of $X$ and $Z$. You'll find that they're independent. – 2011-12-11
  • 0
    The variable $XYZ$ is always equal to zero, right? – 2011-12-11