Suppose X and Y are independent random variables, and let f and g be measurable functions, which are either bounded or non-negative.
The problem is to show that:
$E(f(X)g(Y))=E(f(X))E(g(Y))$
but I am trying to do this without the use of Fubini's theorem, by building up from the case where X and Y are each simple.
For some reason I am extremely weak on simple functions, RVs, etc.
So far all I have is that the independence notion allows for
$P(\{X\in A\}\cap\{Y\in B\})=P(X\in A)P(Y\in B)$ for all $A,B\in\mathcal{B}(\mathbb{R})$
This is my attempt at a solution:
Take $B_1,B_2\in\mathcal{B}(\mathbb{R})$ and define $f:=1_{B_1}$, $g:=1_{B_2}$. Then \begin{align*} \mathbb{E}(f(X)g(Y))&=\int_{\Omega}1_{B_1}(X(\omega))\,1_{B_2}(Y(\omega))\,dP(\omega)\\ &=\int_{\Omega}1_{B_1\times B_2}(X(\omega),Y(\omega))\,dP(\omega)\\ &=P(\{X\in B_1\}\cap\{Y\in B_2\})\\ &=P(X\in B_1)P(Y\in B_2)\\ &=\mathbb{E}(f(X))\,\mathbb{E}(g(Y)). \end{align*}
Now, by linearity, define simple functions $h:=\sum_i b_i 1_{B_i}$ and $k:=\sum_j c_j 1_{C_j}$; then
\begin{align*} \mathbb{E}(h(X)k(Y))&=\mathbb{E}\Big(\sum_i b_i 1_{B_i}(X)\sum_j c_j 1_{C_j}(Y)\Big)\\ &=\sum_{i,j}b_ic_j\,\mathbb{E}(1_{B_i}(X)1_{C_j}(Y))\\ &=\sum_{i,j}b_ic_j\,\mathbb{E}(1_{B_i}(X))\,\mathbb{E}(1_{C_j}(Y))\\ &=\mathbb{E}(h(X))\,\mathbb{E}(k(Y)). \end{align*}
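This is only a numerical sanity check, not part of the proof, but it may help build intuition: a quick Monte Carlo estimate of both sides of the identity for independent X, Y and simple (step) functions. The particular distributions and step functions below are arbitrary illustrative choices.

```python
import numpy as np

# Monte Carlo check of E[f(X)g(Y)] = E[f(X)] E[g(Y)] for independent X, Y
# and simple functions f, g (finite linear combinations of indicators).
rng = np.random.default_rng(0)
n = 1_000_000
X = rng.normal(size=n)        # X ~ N(0, 1)
Y = rng.exponential(size=n)   # Y ~ Exp(1), drawn independently of X

f = lambda x: 2.0 * (x > 0) + 1.0 * (x <= -1)   # simple function of X
g = lambda y: 3.0 * (y < 1) + 0.5 * (y >= 2)    # simple function of Y

lhs = np.mean(f(X) * g(Y))             # estimate of E[f(X)g(Y)]
rhs = np.mean(f(X)) * np.mean(g(Y))    # estimate of E[f(X)] E[g(Y)]
print(lhs, rhs)  # the two estimates agree up to Monte Carlo error
```

The gap between the two estimates shrinks like $1/\sqrt{n}$, consistent with the identity holding exactly.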
Possible problems with the solution
This does not cover the unbounded (non-negative) case; how can I extend the argument to it?
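One standard route, sketched here under the assumption that the simple case above is in order: approximate non-negative $f$ and $g$ from below by simple functions and pass to the limit via monotone convergence.

```latex
% Sketch: extension from simple to non-negative measurable f, g.
% Take simple f_n \uparrow f and g_n \uparrow g pointwise (e.g. the
% standard dyadic truncations). Then f_n(X)g_n(Y) \uparrow f(X)g(Y), so
\begin{align*}
\mathbb{E}\bigl(f(X)g(Y)\bigr)
  &= \lim_{n\to\infty}\mathbb{E}\bigl(f_n(X)g_n(Y)\bigr)
     && \text{(monotone convergence)}\\
  &= \lim_{n\to\infty}\mathbb{E}\bigl(f_n(X)\bigr)\,\mathbb{E}\bigl(g_n(Y)\bigr)
     && \text{(simple case)}\\
  &= \mathbb{E}\bigl(f(X)\bigr)\,\mathbb{E}\bigl(g(Y)\bigr).
     && \text{(monotone convergence again)}
\end{align*}
% The bounded signed case then follows by splitting f = f^+ - f^-
% and g = g^+ - g^- and using linearity.
```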
Also, $B_1, B_2$, etc. are not members of the $\sigma$-field on the sample space ($1_{B_i}$ is not applied to $\omega$ directly). Which $\sigma$-field do they belong to instead?
Help, pointers, detail is always appreciated. Thanks!