If a continuous random variable $X$ has a symmetric distribution around $0$, what is the conditional distribution of $X$ given $|X|$?
Conditional distribution of continuous random variable $X$ given $|X|$?
-
Which definition of the conditional distribution do you know? – 2012-10-03
-
Disintegration. I guess the answer is $\frac{F_X(x)}{F_X(x)+F_X(-x)}$, but I need a formal proof. – 2012-10-03
-
*disintegration*... Is this a definition? You might try to be more explicit. (And your guess is wrong.) – 2012-10-03
-
@Martin : Don't confuse $F$ with $f$. In conventional notation, the former is the c.d.f. and the latter is the density. Your guess would be right if you had $f$ rather than $F$. – 2012-10-03
-
It was just a typo. Thanks. – 2012-10-03
-
Please do **not** [simultaneously crosspost](http://meta.stackexchange.com/q/64068/158524) on multiple SE sites: http://stats.stackexchange.com/q/38567/2970 – 2012-10-04
2 Answers
For a rigorous answer one needs to rely on a precise definition of conditional distribution. Recall that, for any random variables $X$ and $Y$, one calls a conditional distribution of $X$ given $Y$ any family $(\mu_y)_y$ of probability measures on the state space of $X$, indexed by the state space of $Y$, such that, for every bounded measurable function $u$, $$ \mathbb E(u(X)\mid Y)=L_u(Y)\ \text{almost surely},\quad\text{where}\quad L_u(y)=\int u(z)\,\mathrm d\mu_y(z). $$ Equivalently, one asks that, for all bounded measurable functions $u$ and $v$, $$ \mathbb E(u(X)v(Y))=\mathbb E(L_u(Y)v(Y)). $$
In the case at hand, $Y=|X|$ has density $g(y)=f_X(y)+f_X(-y)$ on $y\gt0$, hence $$ \mathbb E(L_u(Y)v(Y))=\int_{y\gt0}L_u(y)v(y)g(y)\mathrm dy. $$ On the other hand, $$ \mathbb E(u(X)v(Y))=\int u(x)v(|x|)f_X(x)\mathrm dx=\int_{y\gt0} (u(y)f_X(y)+u(-y)f_X(-y))v(y)\mathrm dy. $$ The only way these can coincide for every function $v$ is that, for Lebesgue almost every $y$, $$ L_u(y)g(y)=u(y)f_X(y)+u(-y)f_X(-y), $$ that is, $$ L_u(y)=\frac{u(y)f_X(y)+u(-y)f_X(-y)}{f_X(y)+f_X(-y)}. $$ Finally, the only way $L_u$ may be as above is when the family $(\mu_y)_y$ is such that, $\mathbb P_Y(\mathrm dy)$ almost surely, $$ \mu_y(\mathrm dx)=\frac{f_X(y)\delta_y(\mathrm dx)+f_X(-y)\delta_{-y}(\mathrm dx)}{f_X(y)+f_X(-y)}. $$ In particular, if the distribution of $X$ is symmetric, $f_X(y)=f_X(-y)$ hence, as you guessed, $$ \mu_y(\mathrm dx)=\frac12(\delta_y(\mathrm dx)+\delta_{-y}(\mathrm dx)). $$
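As a quick numerical sanity check (a sketch only, assuming the deliberately asymmetric example $X\sim\mathcal N(1,1)$ and arbitrary bounded test functions $u=\tanh$, $v(y)=e^{-y}$), one can compare both sides of $\mathbb E(u(X)v(|X|))=\mathbb E(L_u(|X|)v(|X|))$ by Monte Carlo:

```python
# Sketch of a Monte Carlo check (assumptions: X ~ Normal(mean=1, sd=1), so f_X(y) != f_X(-y);
# u and v are arbitrary bounded test functions chosen only for illustration).
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
x = rng.normal(loc=1.0, scale=1.0, size=1_000_000)   # samples of X

f = lambda t: norm.pdf(t, loc=1.0, scale=1.0)        # density f_X
u = lambda t: np.tanh(t)                             # bounded test function u
v = lambda t: np.exp(-t)                             # bounded test function v (applied to |X| >= 0)

y = np.abs(x)
L_u = (u(y) * f(y) + u(-y) * f(-y)) / (f(y) + f(-y)) # the formula derived above

lhs = np.mean(u(x) * v(y))   # estimate of E[u(X) v(|X|)]
rhs = np.mean(L_u * v(y))    # estimate of E[L_u(|X|) v(|X|)]
print(lhs, rhs)              # the two values should agree up to Monte Carlo error
```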
A shortcut in the symmetric case is to note that $(X,|X|)$ and $(-X,|X|)$ coincide in distribution, hence $\mathbb E(u(X)\mid |X|)=\mathbb E(u(-X)\mid |X|)$, and therefore both coincide with $$ \mathbb E\left(\tfrac12(u(X)+u(-X))\,\middle|\, |X|\right)=L_u(|X|), $$ where $L_u(y)=\frac12(u(y)+u(-y))$ for $y\gt0$, the last equality because $\frac12(u(X)+u(-X))=L_u(|X|)$ is itself a function of $|X|$. This yields directly that the measure $\mu_y$ is uniform on $\{-y,y\}$.
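The symmetric case is also easy to check by simulation; a minimal sketch, assuming $X\sim\mathcal N(0,1)$: within any narrow bin of values of $|X|$, the sign of $X$ should be positive about half the time.

```python
# Sketch of a check for the symmetric case (assumption: X ~ Normal(0, 1)).
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=1_000_000)
y = np.abs(x)

# Conditionally on |X| falling in a narrow bin, the sign of X should be +/- with probability ~1/2.
for lo in (0.5, 1.0, 2.0):
    in_bin = (y >= lo) & (y < lo + 0.05)
    print(lo, np.mean(x[in_bin] > 0))   # each printed frequency should be close to 0.5
```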
-
Thanks. Good, liked both your answers. – 2012-10-04
Let $f_X(x)$ be the density function. The event $|X|=x$ is the same as the event that $X$ is either $x$ or $-x$. So you have $$ \Pr(X=x\mid |X|=x)=\frac{f_X(x)}{f_X(x)+f_X(-x)}, $$ and $\Pr(X=-x\mid |X|=x)$ is the complementary probability.
But you say it's symmetric about $0$, so that probability is $1/2$.
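The density ratio above can be illustrated numerically; a minimal sketch, assuming the asymmetric example $X\sim\mathcal N(1,1)$: for small $\varepsilon$, the empirical value of $\Pr(X\in[x,x+\varepsilon)\mid |X|\in[x,x+\varepsilon))$ should be close to $f_X(x)/(f_X(x)+f_X(-x))$.

```python
# Sketch of a numeric illustration (assumption: X ~ Normal(mean=1, sd=1), an asymmetric example).
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)
samples = rng.normal(loc=1.0, scale=1.0, size=2_000_000)

x, eps = 0.8, 0.02
f = lambda t: norm.pdf(t, loc=1.0, scale=1.0)                       # density f_X

in_abs_bin = (np.abs(samples) >= x) & (np.abs(samples) < x + eps)   # event {|X| in [x, x+eps)}
in_pos_bin = (samples >= x) & (samples < x + eps)                   # event {X in [x, x+eps)}

print(in_pos_bin.sum() / in_abs_bin.sum())   # empirical conditional probability
print(f(x) / (f(x) + f(-x)))                 # the density ratio from the answer
```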
-
Thanks. That's what I guessed but typed wrongly above. How do you formally prove it? – 2012-10-03
-
$\Pr(X=x)=0$. – 2012-10-03
-
@did : Typo fixed. – 2012-10-03
-
I think I may return to write a proof later... – 2012-10-03
-
@Did : Yes, it does say the distribution is continuous, so my argument is perhaps incomplete. The conditional probability $\Pr(X=x\mid |X|)$ should be a random variable that is a function of $|X|$. – 2013-02-04
-
$\Pr(X=x\mid |X|)=0$ fits the definition. – 2013-02-04