
If $(C,y) \mapsto \mathcal{L}_{X|Y}(C|y)$ is the conditional distribution of the random variable $X$ given $Y$, then:

$$ E^Y(h(X)|y) = \int h(x) \mathcal{L}_{X|Y}(dx|y) $$

If we take $h = 1_C$, then $E^Y(1(X \in C)|y) = \int 1_C(x) \, \mathcal{L}_{X|Y}(dx|y) = \mathcal{L}_{X|Y}(C|y)$. But I don't know how to prove the identity when $h$ is a simple non-negative function, when $h: S \to [0,\infty]$ is measurable, and when $h: S \to \mathbb{R}$ is integrable. Thank you for any help.
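For intuition, the indicator case can be checked numerically on a finite space, where $\mathcal{L}_{X|Y}(\cdot|y)$ is just the conditional pmf $p(x,y)/p_Y(y)$. The joint pmf below is a made-up example; the function names are illustrative, not standard:

```python
# Illustrative check on a finite space: the conditional distribution
# L_{X|Y}(. | y) is p(x, y) / p_Y(y), and E[h(X) | Y = y] is the sum
# of h(x) against that distribution.

# hypothetical joint pmf p(x, y) on {0, 1, 2} x {0, 1}
joint = {
    (0, 0): 0.10, (1, 0): 0.25, (2, 0): 0.15,
    (0, 1): 0.20, (1, 1): 0.05, (2, 1): 0.25,
}

def conditional_pmf(y):
    """L_{X|Y}(. | y) as a dict x -> P(X = x | Y = y)."""
    p_y = sum(p for (x, yy), p in joint.items() if yy == y)
    return {x: p / p_y for (x, yy), p in joint.items() if yy == y}

def cond_expectation(h, y):
    """E[h(X) | Y = y] = integral of h against L_{X|Y}(dx | y)."""
    return sum(h(x) * p for x, p in conditional_pmf(y).items())

# take h = 1_C with C = {1, 2}; the expectation equals L_{X|Y}(C | y)
C = {1, 2}
indicator = lambda x: 1.0 if x in C else 0.0
lhs = cond_expectation(indicator, 0)
rhs = sum(p for x, p in conditional_pmf(0).items() if x in C)
assert abs(lhs - rhs) < 1e-12
```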

1 Answer


As you already pointed out, the assertion holds if $h(x) = 1_C(x)$ is the indicator function of a measurable set $C$. By the linearity of the (conditional) expectation, this implies that

$$\mathbb{E}(h(X) \mid Y=y) = \int h(x) \mathcal{L}_{X \mid Y}(dx \mid y)$$

holds for any simple function $h$, i.e. any function $h$ of the form

$$h(x) = \sum_{j=1}^n a_j 1_{C_j}(x).$$
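Spelled out, the linearity step reads (with measurable sets $C_j$ and constants $a_j$):

$$\begin{align*} \mathbb{E}\Bigl(\sum_{j=1}^n a_j 1_{C_j}(X) \,\Big|\, Y=y\Bigr) &= \sum_{j=1}^n a_j \, \mathbb{E}(1_{C_j}(X) \mid Y=y) = \sum_{j=1}^n a_j \, \mathcal{L}_{X \mid Y}(C_j \mid y) \\ &= \int \sum_{j=1}^n a_j 1_{C_j}(x) \, \mathcal{L}_{X \mid Y}(dx \mid y). \end{align*}$$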

Now if $h \geq 0$ is a measurable function, then there exists a sequence of simple functions $(h_j)_{j \in \mathbb{N}}$ such that $h_j \geq 0$ and $h_j(x) \uparrow h(x)$ as $j \to \infty$ for all $x$. Using the monotone convergence theorem (MCT) and the fact that the assertion holds for simple functions, we get

$$\begin{align*} \mathbb{E}(h(X) \mid Y) &\stackrel{\text{MCT}}{=} \lim_{j \to \infty} \mathbb{E}(h_j(X) \mid Y) \\ &= \lim_{j \to \infty} \int h_j(x) \mathcal{L}_{X \mid Y}(dx \mid Y) \\ &\stackrel{\text{MCT}}{=} \int h(x) \, \mathcal{L}_{X \mid Y}(dx \mid Y) \end{align*}$$

which shows that

$$\mathbb{E}(h(X) \mid Y=y) = \int h(x) \, \mathcal{L}_{X \mid Y}(dx \mid y).$$
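As a side note (not part of the proof), the standard choice of approximating sequence is the dyadic one, $h_j = \min(2^{-j}\lfloor 2^j h\rfloor,\, j)$, and its pointwise monotone convergence can be checked numerically. The function names below are illustrative:

```python
import math

def dyadic_approx(h, j):
    """Return the j-th simple-function approximation of a non-negative h:
    h_j(x) = min(floor(2^j * h(x)) / 2^j, j).  It satisfies 0 <= h_j <= h
    and h_j(x) increases to h(x) as j -> infinity."""
    def hj(x):
        return min(math.floor((2 ** j) * h(x)) / (2 ** j), j)
    return hj

h = lambda x: x * x  # a non-negative measurable function
for x in [0.3, 1.7, 2.5]:
    vals = [dyadic_approx(h, j)(x) for j in range(1, 12)]
    # monotone increasing and converging to h(x) from below
    assert all(a <= b for a, b in zip(vals, vals[1:]))
    assert vals[-1] <= h(x) < vals[-1] + 2 ** -11
```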

Finally, if $h$ is a measurable function such that $h(X) \in L^1$, then we can write $h = h^+ - h^-$, apply the first part of the proof to the positive part $h^+ \geq 0$ and the negative part $h^- \geq 0$, and use the linearity of the integral again.
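Concretely, with $h^+ = \max(h,0)$ and $h^- = \max(-h,0)$, both conditional expectations $\mathbb{E}(h^{\pm}(X) \mid Y=y)$ are finite for almost every $y$ because $h(X) \in L^1$, so

$$\mathbb{E}(h(X) \mid Y=y) = \int h^+(x) \, \mathcal{L}_{X \mid Y}(dx \mid y) - \int h^-(x) \, \mathcal{L}_{X \mid Y}(dx \mid y) = \int h(x) \, \mathcal{L}_{X \mid Y}(dx \mid y).$$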
