
As I keep reading probability books, I keep running into issues that no one seems to address.

For example,

For $\omega \in \Omega$ and $X$, $Y$ independent random variables, we define $Z(\omega) = X(\omega)\cdot Y(\omega)$. Then, if $E[X]$, $E[Y]$, and $E[Z]$ are all defined, we know that $E[X]\cdot E[Y] = E[Z]$.

But I am really curious: is there a situation where $E[X]$ and $E[Y]$ are defined, but $E[X\cdot Y]$ (that is, $E[Z]$) is infinite or divergent? I wasn't able to think of an example.
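(To make the worry concrete: if the independence assumption is dropped, this really can happen. The following is my own illustrative sketch, not part of the question, using a Pareto-type density $f(x) = 1.5\,x^{-2.5}$ on $[1,\infty)$ with $Y = X$; the truncated second moment has the closed form $1.5\int_1^M x^{-0.5}\,dx = 3(\sqrt{M}-1)$, which grows without bound even though $E[X] = 3$ is finite.)

```python
# Hypothetical example: X ~ Pareto with density f(x) = 1.5 * x**-2.5 on [1, inf).
# Then E[X] = 1.5 * ∫_1^∞ x^-1.5 dx = 3 is finite,
# but with Y = X (maximally dependent), E[XY] = E[X^2] diverges.

def truncated_second_moment(M):
    # Closed form of 1.5 * ∫_1^M x**2 * x**-2.5 dx = 3 * (sqrt(M) - 1)
    return 3.0 * (M ** 0.5 - 1.0)

for M in (10, 10**4, 10**8):
    print(M, truncated_second_moment(M))  # grows without bound as M -> inf
```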

(Is it okay to post more than one question on the same day?)

Thanks again.

  • The formula $E[XY]=E[X]E[Y]$, when $X$ and $Y$ are independent integrable random variables, can be proved using the Monotone Convergence Theorem. But this is not elementary... – 2011-04-07

2 Answers


No. If $X,Y$ are integrable (i.e. $E|X| < \infty$, $E|Y|<\infty$) and independent, then $Z=XY$ is integrable.

The first general proof I can think of is to use the distribution measures $\mu_X$, $\mu_Y$ for $X,Y$. We have $E|Z| = \iint |xy|\mu_X(dx)\mu_Y(dy)$, which by Tonelli's theorem equals $\int |x| \mu_X(dx) \int |y| \mu_Y(dy)$. But this is just $E|X| E|Y|$ which is finite.
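(Not part of the original answer, but a quick Monte Carlo sanity check of the Tonelli identity $E|XY| = E|X|\,E|Y|$; the distributions chosen here are arbitrary examples of my own.)

```python
import random

random.seed(0)
N = 200_000
# Independent samples: X ~ Exp(1), so E|X| = 1; Y ~ N(0, 2), so E|Y| = 2*sqrt(2/pi)
xs = [random.expovariate(1.0) for _ in range(N)]
ys = [random.gauss(0.0, 2.0) for _ in range(N)]

mean_abs_x = sum(abs(x) for x in xs) / N
mean_abs_y = sum(abs(y) for y in ys) / N
mean_abs_xy = sum(abs(x * y) for x, y in zip(xs, ys)) / N

# By Tonelli, E|XY| should equal E|X| * E|Y| for independent X, Y
print(mean_abs_xy, mean_abs_x * mean_abs_y)  # the two values agree up to sampling error
```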

  • Yuval, that doesn't help me either. Can you explain to me, in general, why it is not possible, without any formal proof? – 2011-04-06

You can think of it this way. If $X$ and $Y$ are independent, the conditional distribution of $|Y|$ given $X$ is the same as the distribution of $|Y|$ itself. So $E[\,|X|\,|Y| \mid X\,] = |X|\, E[\,|Y| \mid X\,] = |X|\, E[|Y|]$, and $E[\,|X|\,|Y|\,] = E\bigl[E[\,|X|\,|Y| \mid X\,]\bigr] = E[|X|]\, E[|Y|]$.
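(An editorial addition, not from the answer: the conditioning/tower-property argument above can be verified exactly on a small discrete example. The distributions below are my own arbitrary choice, computed with exact rational arithmetic.)

```python
from fractions import Fraction
from itertools import product

# Hypothetical example: X uniform on {-1, 2}, Y uniform on {1, 3}, independent.
px = {-1: Fraction(1, 2), 2: Fraction(1, 2)}
py = {1: Fraction(1, 2), 3: Fraction(1, 2)}

# By independence, E[|X||Y| | X = x] = |x| * E|Y|
e_abs_y = sum(abs(y) * p for y, p in py.items())
cond = {x: abs(x) * e_abs_y for x in px}

# Direct computation of E[|X||Y|] over the joint (product) distribution
lhs = sum(abs(x) * abs(y) * px[x] * py[y] for x, y in product(px, py))
# Tower property: E[|X||Y|] = E[ E[|X||Y| | X] ]
rhs = sum(cond[x] * px[x] for x in px)

e_abs_x = sum(abs(x) * p for x, p in px.items())
print(lhs, rhs, e_abs_x * e_abs_y)  # all three agree exactly
```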

  • See comment above. – 2011-04-11