
I have a question regarding the following theorem:

Discrete random variables $X$ and $Y$ on $(\Omega,\mathcal{F},\mathbb P)$ are independent if and only if

$\mathbb E(g(X)h(Y))=\mathbb E(g(X))\mathbb E(h(Y))$

for all functions $g,h\colon \mathbb R\to \mathbb R$ for which the last two expectations exist.

The proof goes as follows:

The necessity of the theorem follows just as in the proof of theorem //. To prove sufficiency, let $a,b\in \mathbb R$ and define $g$ and $h$ by

$\begin{align}g(x)=\begin{cases}1&\text{if } x=a\\0&\text{if } x\neq a,\end{cases} && h(x)=\begin{cases}1&\text{if } x=b\\0&\text{if } x\neq b.\end{cases}\end{align}$

Then $\mathbb E(g(X)h(Y))=\mathbb P(X=a,Y=b)$

and

$\mathbb E(g(X))\mathbb E(h(Y))=\mathbb P(X=a)\mathbb P(Y=b)$

giving that $p_{X,Y}(a,b)=p_X(a)p_Y(b)$.

Now, in my eyes, the proof gives an example of the theorem instead of a general proof for all functions... So, instead of working with arbitrary functions $g$ and $h$, we work with two specific functions $g$ and $h$ and show that the theorem holds. Can someone explain to me how this is a proof of the theorem?

To be clear: I am only interested in the "sufficiency" part of the proof!

  • But, in the "sufficiency" part of the proof, you only need to prove that $X$ and $Y$ are independent, provided that condition (1) holds for all functions $f$ and $g$. To do this, you need just indicator functions. See my answer below (I added a bit more explanation). (2017-01-18)
  • Please don't use the proof-verification tag as the only tag on your question. Use other tags to indicate which area of mathematics the proof comes from. (2017-02-01)

1 Answer


The proof says:

The necessity of the theorem follows just as in the proof of theorem //.

So, for one direction of the implication (that if $X$ and $Y$ are independent random variables, then $$\mathbb E(g(X)h(Y))=\mathbb E(g(X))\mathbb E(h(Y)) \tag{1}$$ for all functions $g,h: \mathbb R \to \mathbb R$ for which the last two expectations exist), the author references the proof of some other theorem.

The rest is the proof of the other direction: that the above condition is sufficient for independence of $X$ and $Y$. To prove this, it is enough to consider the case where $g$ and $h$ are indicator functions.

Recall that discrete random variables $X$ and $Y$ are independent iff $$P(X=a, Y=b)=P(X=a)P(Y=b)$$ for every $a,b \in \mathbb R$. And, indeed, if for arbitrary $a,b \in \mathbb R$ we put $$\begin{align}f(x)=\begin{cases}1,&\text{if } x=a\\0,&\text{if } x\neq a\end{cases} && g(x)=\begin{cases}1,&\text{if } x=b\\0,&\text{if } x\neq b\end{cases}\end{align}$$ then from (1) we get $$P(X=a, Y=b)=E(f(X)g(Y))=E(f(X))E(g(Y))=P(X=a)P(Y=b).$$
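The indicator-function argument above can be checked numerically. The sketch below (a minimal illustration, with a made-up joint pmf built as an independent product, so condition (1) holds by construction) verifies that $E(f(X)g(Y))$ equals $P(X=a, Y=b)$ and that $E(f(X))E(g(Y))$ equals $P(X=a)P(Y=b)$ for indicator functions $f$ and $g$:

```python
# Hypothetical marginal pmfs for X and Y (values and probabilities assumed);
# the joint pmf is their product, so X and Y are independent by construction.
px = {0: 0.3, 1: 0.7}
py = {2: 0.4, 3: 0.6}
joint = {(x, y): px[x] * py[y] for x in px for y in py}

def expect(func, pmf):
    """Expectation of func(V) when V has pmf {value: probability}."""
    return sum(func(v) * p for v, p in pmf.items())

def expect_joint(f, g, joint_pmf):
    """E[f(X)g(Y)] under a joint pmf {(x, y): probability}."""
    return sum(f(x) * g(y) * p for (x, y), p in joint_pmf.items())

a, b = 1, 3
f = lambda x: 1.0 if x == a else 0.0   # indicator of the event {X = a}
g = lambda y: 1.0 if y == b else 0.0   # indicator of the event {Y = b}

lhs = expect_joint(f, g, joint)        # equals P(X=a, Y=b)
rhs = expect(f, px) * expect(g, py)    # equals P(X=a) * P(Y=b)
print(lhs, rhs)  # both equal px[1] * py[3] = 0.42
```

Since $f(X)g(Y)$ is $1$ exactly on the event $\{X=a, Y=b\}$ and $0$ elsewhere, its expectation is the probability of that event, which is what makes the special case of indicator functions strong enough to recover independence.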

  • Thank you for your answer. I asked my teacher as well, and I think I get it now. The hypothesis in the sufficiency part of the bi-implication is actually quite strong, but if we simply assume everything it gives us, then we can derive that $X$ and $Y$ are independent by taking a special case (the condition holds for all those functions $f$ and $g$, so in particular for the indicators). (2017-01-19)