
Let $X$ and $Y$ be random variables defined on $(\Omega,\Sigma,P)$, with $Y = aX + b$ and $X$ Gaussian. I want to derive the conditional p.d.f. defined by:

$f_{Y|X}(y|x) := \frac{f_{X,Y}(x,y)}{f_X(x)}$

It is easy to show that $E[Y|X] = Y$ a.s. (since $Y$ is $\sigma(X)$-measurable), and also that if $g(x) = \int_\mathbb{R} y f_{Y|X}(y|x)\,dy$, then $(g \circ X)(\omega) = g(X(\omega))$ is a version of the conditional expectation (this is shown in the textbook "Probability with Martingales" by David Williams). Therefore,

$g(X(\omega)) = aX(\omega) + b\text{ a.s.}$

Is it possible to derive $f_{Y|X}(y|x)$ from the above relation? If not, what is the best way to derive this?
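As a quick sanity check of the relation $g(x) = ax + b$ (a hypothetical simulation, not part of the derivation; the values $a = 2$, $b = 1$, $x_0 = 0.5$ are arbitrary choices), one can estimate $E[Y \mid X \approx x_0]$ numerically by conditioning on a narrow bin around $x_0$:

```python
import numpy as np

# Hypothetical sanity check: X ~ N(0, 1) and Y = a*X + b, with arbitrary a, b.
# Estimate E[Y | X near x0] by averaging Y over samples whose X falls in a
# narrow bin around x0, and compare with the claimed g(x0) = a*x0 + b.
rng = np.random.default_rng(0)
a, b, x0, eps = 2.0, 1.0, 0.5, 0.05

X = rng.standard_normal(1_000_000)
Y = a * X + b

mask = np.abs(X - x0) < eps      # condition on X lying close to x0
cond_mean = Y[mask].mean()       # empirical E[Y | X ~ x0]
cond_std = Y[mask].std()         # spread of Y given X ~ x0

print(cond_mean)  # should be close to a*x0 + b = 2.0
print(cond_std)   # should be close to 0: Y is (almost) constant given X
```

Note that the conditional spread shrinks with the bin width: given $X = x$, there is no residual randomness in $Y$, which already hints that $f_{Y|X}$ cannot be an ordinary density.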

Any help is much appreciated. Thanks

  • $Y = aX+b$ is either an identity ($Y(\omega) = aX(\omega) + b$ for all $\omega \in \Omega$) or holds almost surely ($Y(\omega) = aX(\omega) + b$ for almost all $\omega \in \Omega$), depending on what you mean when you write $Y = aX+b$. In either case, given $X(\omega) = x$, we have $Y(\omega) = ax+b$ for all (or almost all) such $\omega$, so $P\{Y = ax+b \mid X = x\} = 1$: conditioned on $X = x$, $Y$ is (almost surely) the constant $ax+b$. (2012-03-21)
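To make the point of the comment above explicit (a sketch in distributional notation, not an ordinary density): since $Y$ is a.s. a deterministic function of $X$, the joint law of $(X,Y)$ is concentrated on the line $y = ax+b$ and is singular with respect to Lebesgue measure on $\mathbb{R}^2$, so $f_{X,Y}$, and hence $f_{Y|X}$, does not exist as an ordinary function. One can only write

$$f_{Y|X}(y\mid x) = \delta\big(y - (ax+b)\big),$$

where $\delta$ is the Dirac delta; equivalently, the conditional distribution of $Y$ given $X = x$ is the point mass at $ax+b$.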

0 Answers