It is a well-known fact that if $X,Y$ are independent, integrable random variables, then $E[Y|X]=E[Y]$. Now assume that $Y$ is centered and $E[Y|X]=0$. What reasonable conclusions can be drawn about the distributions of $Y$ and $X$? In particular, I am interested in the case $Y=X+Z$, where something is known about the distributions of $X$, $Y$ and $Z$. Thanks.
If $E[Y|X]$ is constant, then what?
This was not my question. – 2012-04-13
3 Answers
Recall that $\mathrm E(Y\mid X)=0$ is equivalent to the condition that $\mathrm E(Yu(X))=0$ for every bounded measurable function $u$.
If $u\equiv1$, one gets $\mathrm E(Y)=0$, as was to be expected, hence it is unnecessary to assume that $Y$ is centered. Likewise, $\mathrm E(YX)=0$ and, more generally, $\mathrm E(YX^n)=0$ for every integer $n\geqslant0$ such that $YX^n$ is integrable. And $\mathrm E(Y;X\leqslant x)=0$ for every real $x$.
A canonical example such that $\mathrm E(Y\mid X)=0$ but $X$ and $Y$ are far from being independent is when $Y=\varepsilon v(X)$ with $v$ a measurable function and $\varepsilon=\pm1$ a centered Bernoulli random variable independent of $X$.
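A minimal simulation sketch of this example (the choice $v(x)=1+x^2$ and standard normal $X$ are arbitrary, just for illustration): the moment conditions hold numerically, yet $|Y|=v(X)$ is completely determined by $X$.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Canonical example: Y = eps * v(X), eps = ±1 independent of X.
# v(x) = 1 + x**2 is an arbitrary illustrative choice.
X = rng.normal(size=n)
eps = rng.choice([-1.0, 1.0], size=n)
Y = eps * (1.0 + X**2)

print(np.mean(Y))      # ~ 0, consistent with E(Y) = 0
print(np.mean(Y * X))  # ~ 0, consistent with E(YX) = 0

# Yet Y is far from independent of X: |Y| = 1 + X**2 is a function of X.
print(np.corrcoef(np.abs(Y), X**2)[0, 1])  # ~ 1
```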
If $E[Y|X=x]=0$ for all $x$ in the support of $X$ then you know $E[Y]=0$ (though you knew that already). You also know that $E[XY]=0$, i.e. that $X$ and $Y$ are uncorrelated.
You may not know more than that; for example, $Y$ could have a conditional normal distribution given $X$ with mean $0$ and variance $\exp(\sin(X))$.
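A quick numerical check of this heteroskedastic example (a sketch, with $X$ taken standard normal for concreteness): $Y$ is centered and uncorrelated with $X$, but its conditional spread clearly depends on $X$, so the two are not independent.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

# Y | X ~ Normal(0, exp(sin(X))), so E(Y | X) = 0 by construction.
X = rng.normal(size=n)
Y = rng.normal(size=n) * np.sqrt(np.exp(np.sin(X)))

print(np.mean(Y), np.mean(X * Y))  # both ~ 0: Y centered, uncorrelated with X

# But the conditional variance of Y depends on X, so they are not independent:
low, high = Y[X < -1.0], Y[X > 1.0]
print(np.var(low), np.var(high))   # noticeably different conditional variances
```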
If $Y=X+Z$ and $E[Y|X=x]=0$ for all $x$ in the support of $X$, then $0=E[Y|X=x]=E[X|X=x]+E[Z|X=x]=x+E[Z|X=x]$, i.e. $E[Z|X=x]=-x$.
So you have $E[Z]=-E[X]$ if that exists, though that is also obvious since $E[Y]=0$. Moreover, if $X$ is square integrable, then $\text{Cov}(X,Z)=\text{Cov}(X,Y)-\text{Var}[X]=-\text{Var}[X]$, since $X$ and $Y$ are uncorrelated.
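A sanity check of this under one arbitrary construction consistent with the hypotheses (taking $Z=-X+\varepsilon(1+X^2)$ with $\varepsilon=\pm1$ independent of $X$, so that $E[Z|X=x]=-x$ and $Y=X+Z$ satisfies $E[Y|X]=0$):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1_000_000

# One arbitrary construction with Y = X + Z and E(Y | X) = 0:
# Z = -X + eps * (1 + X**2), eps = ±1 independent of X, so E(Z | X = x) = -x.
X = rng.normal(size=n)
eps = rng.choice([-1.0, 1.0], size=n)
Z = -X + eps * (1.0 + X**2)
Y = X + Z

print(np.mean(Y), np.mean(X * Y))      # both ~ 0
print(np.cov(X, Z)[0, 1], -np.var(X))  # Cov(X, Z) ~ -Var(X)
```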
The covariance is only well defined for square integrable random variables. If $X$ and $Y$ are square integrable and $E(Y\mid X)=0$ then $X$ and $Y$ are orthogonal in the Hilbert space of square integrable random variables, or equivalently uncorrelated, as
$$E(YX)=E\big(E(YX\mid X)\big)=E\big(X\,E(Y\mid X)\big)=0.$$
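A small numerical illustration of this orthogonality (reusing the arbitrary construction $Y=\varepsilon(1+X^2)$ from the first answer): the $L^2$ inner product $E(XY)$, and hence the least-squares slope of $Y$ on $X$, are both approximately $0$.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 1_000_000

# Reuse the construction Y = eps * (1 + X**2), eps = ±1 independent of X.
X = rng.normal(size=n)
eps = rng.choice([-1.0, 1.0], size=n)
Y = eps * (1.0 + X**2)

inner = np.mean(X * Y)        # L2 inner product E(XY)
slope = inner / np.mean(X**2) # least-squares slope of Y on X
print(inner, slope)           # both ~ 0: X and Y are orthogonal in L2
```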