
I have $E(X\mid Y,Z)=0$ with $X$ independent of $Y$ and of $Z$, and I want to conclude that $E(X)=0$ ($X,Y,Z$ are real-valued random variables). It seems quite obvious, but when I try to make a rigorous argument I find a missing step.

I tried to approach this from both sides:

  • From the independence it follows that $X$ is independent of $\sigma(Y) \cup \sigma(Z)$.

  • More precisely, the first expression is $E(X\mid \sigma(Y,Z))$, where $\sigma(Y,Z)=\sigma(\sigma(Y) \cup \sigma(Z))$.

It is known that in general $\sigma(\mathcal{A} \cup \mathcal{B}) \neq \mathcal{A} \cup \mathcal{B}$ $(*)$ for $\sigma$-algebras $\mathcal{A}, \mathcal{B}$. Question: how can I conclude $E(X)=0$ anyway?

Maybe for independent random variables the equality in $(*)$ does hold? Or is there perhaps an alternative, more convenient expression for conditional expectations with respect to more than one random variable?

  • For every integrable random variable $X$ and every $\sigma$-algebra $\mathcal G$, one of the two defining properties which characterize $Y=E(X\mid\mathcal G)$ is that $E(X;A)=E(Y;A)$ for every $A\in\mathcal G$ (you know the other one). Use this for $A=\Omega$. – Didier Piau, 2011-08-04

2 Answers


Elaborating on the comment by Didier Piau. You don't even need independence here, since it is simply the "tower rule": $ \mathsf E[\mathsf E[X\mid\mathcal F]] = \mathsf E X $ under suitable integrability conditions. In your case, by the definition of conditional expectation you have $ \mathsf E[\eta\cdot \mathbf 1_A] = \mathsf E[X \mathbf 1_A] $ for every $A\in \sigma(Y,Z)$, where $\eta = \mathsf E [X\mid Y,Z]$.

You know from the definition of a $\sigma$-algebra that $\Omega\in \sigma(Y,Z)$; since $\mathbf 1_\Omega = 1$ and $\eta = 0$ almost surely by assumption, taking $A=\Omega$ gives $ 0 = \mathsf E[\eta] = \mathsf E[X]. $
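
To see the tower rule at work numerically, here is a minimal numpy sketch. The particular distributions of $Y$, $Z$ and the construction of $X$ are my own choices for illustration, and $X$ is deliberately *not* independent of $(Y,Z)$, to stress that only $\mathsf E[X\mid Y,Z]=0$ is used:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10**6

# Y and Z are arbitrary; X is built so that E[X | Y, Z] = 0 by construction
# (a symmetric noise term whose scale depends on (Y, Z)), so X is NOT
# independent of (Y, Z) -- only the tower rule is being exercised.
y = rng.normal(size=n)
z = rng.integers(0, 2, size=n)                      # Z takes values 0 and 1
x = (1.0 + np.abs(y) + z) * rng.normal(size=n)      # conditional mean is 0

# Tower rule: E[X] = E[E[X | Y, Z]] should be (close to) 0.
print("E[X] ~", x.mean())

# Spot check: the conditional mean inside events of sigma(Y, Z) is also ~ 0,
# e.g. within bins of Y crossed with the value of Z.
bins = np.digitize(y, np.linspace(-3, 3, 13))
for b in (3, 6, 9):
    for zv in (0, 1):
        sel = (bins == b) & (z == zv)
        print(f"Y-bin {b}, Z={zv}: conditional mean ~ {x[sel].mean():.4f}")
```

With $10^6$ samples the printed means are all close to $0$, which is exactly what $\mathsf E[X\mid Y,Z]=0$ together with the tower rule predicts.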

Edited: answering your last comment. First, an example which justifies your intuition. Let $ X = \begin{cases}1 &\text{with probability } 0.5,\\ 0 &\text{with probability } 0.5,\end{cases} $ and let $A = \{X = 0\}\subset\Omega$, so $\mathsf P(A) = 0.5$. By the definition of conditional expectation we have $ \mathsf E[X\mid A] = \int\limits_\Omega X(\omega)\,\mathsf P(d\omega\mid A) = \frac{1}{\mathsf P(A)}\int\limits_A X(\omega)\,\mathsf P(d\omega) = 0, $ while $\mathsf E[X] = 0.5$.

On the other hand, let us consider the random variable $\xi = \mathbf 1_A$. Clearly, $\mathcal F_\xi = \{\emptyset,A,A^c,\Omega\}$ and you would like to have the same result as before. But there is a slight difference: $ \mathsf E[X\mid\mathcal F_\xi] = 1-\xi, $ and we cannot say that $1-\xi = 0$ $\mathsf P$-a.s. That's why you don't have $\mathsf E[X] = \mathsf E[X\mid\mathcal F_\xi] = 0$.

Roughly speaking, the condition $\mathsf E[X\mid \mathcal F_\xi] = 0$ says that the conditional expectation is zero whatever the value of $\xi$, i.e. in every case that can be distinguished by $\xi$; that is much stronger than conditioning on the single event $A$.
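
The discrete counterexample above is also easy to check numerically; here is a minimal sketch (the simulation is just an illustration of the two computations, with the same $X$, $A$ and $\xi$ as above):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10**6

# X = 1 or 0, each with probability 0.5; A = {X = 0}, xi = 1_A.
x = rng.integers(0, 2, size=n)
A = (x == 0)
xi = A.astype(float)

print("E[X]           ~", x.mean())            # about 0.5
print("E[X | A]       ~", x[A].mean())         # exactly 0: conditioning on the event A
# E[X | F_xi] is the random variable 1 - xi (0 on A, 1 on A^c);
# averaging it recovers E[X], as the tower rule predicts.
print("E[E[X | F_xi]] ~", (1 - xi).mean())     # about 0.5 again
```

So conditioning on the event $A$ gives $0$, while the conditional expectation with respect to $\mathcal F_\xi$ is a random variable whose mean is still $\mathsf E[X]=0.5$.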

  • @Johannes L: here you are. 2011-08-05

See the law of total expectation: $ E(E(X \mid Y)) = E(X). $ The same identity holds with the pair $(Y,Z)$ in place of $Y$, so applied to $E(X\mid Y,Z)=0$ it gives $E(X)=0$, and that does it.