5

I am given the parameters of a bivariate normal distribution ($\mu_x, \mu_y, \sigma_x, \sigma_y,$ and $\rho$). How would I go about finding $\mathrm{Var}(Y\mid X=x)$? I was able to find $\mathrm E[Y\mid X=x]$ by writing $X$ and $Y$ in terms of two standard normal variables and computing the expectation in that manner. I am unsure how to do the same for the variance.

Also, how do I find the probability that both $X$ and $Y$ exceed their mean values (i.e., $P(X>\mu_x, Y > \mu_y)$)?

Thanks for the help!

  • 0
    What did you get for $\mathbb E[Y|X=x]$? (2011-04-20)
  • 0
    Well, I was given numbers where $\mu_x$ and $\mu_y$ are $2$ and $-1$ respectively, the variance of $X$ is $4$, the variance of $Y$ is $1$, and $\rho$ is $-\sqrt 3$. I then expressed $X$ as $X = 2Z_1 + 2$ and $Y$ as $Y = -\frac{\sqrt 3}{2}Z_1 + \frac12 Z_2 - 1$, where $Z_1$ and $Z_2$ are standard normals, and found $\mathbb E[Y|X=x] = -\frac{\sqrt 3}{2}\cdot\frac{x-2}{2} - 1$. (2011-04-20)
  • 0
    You said $\rho$ is $-\sqrt 3$? That's impossible. (2011-04-20)
  • 0
    My bad, it's $-\sqrt 3/2$. (2011-04-20)
  • 0
    I got $\mathbb E[Y|X=x]=-\frac{\sqrt 3}4(x-2)+1$, not $-1$. (2011-04-20)
  • 0
    I must have made an arithmetic mistake then! So now how do I go about finding $\mathrm{Var}(Y|X=x)$? (2011-04-20)
  • 0
    I used a different approach; I'll type it up as a solution. I don't know how to continue with your approach. (2011-04-20)
  • 0
    Thank you. Much appreciated! (2011-04-20)
  • 0
    Sorry, you are right. Your $\mu_y=-1$ and I misread it as $1$. (2011-04-20)

2 Answers

10

Rather than embarking on some pretty involved computations of conditional distributions, one should rely on one of the main assets of Gaussian families, namely, the...

Key feature: In Gaussian families, conditioning acts as a linear projection.

Hence, as the OP suggested, one could do worse than to start from a representation of $(X,Y)$ by i.i.d. standard Gaussian random variables $U$ and $V$, for example, $$ X=\mu_x+\sigma_xU,\qquad Y=\mu_y+\sigma_y(\rho U+\tau V),$$ where the parameter $\tau$ is $$\tau=\sqrt{1-\rho^2}.$$

Since $\sigma_x\ne0$, the sigma-algebra generated by $X$ is also the sigma-algebra generated by $U$, hence conditioning on $X$ or on $U$ is the same. Furthermore, constants and functions of $X$ or $U$ are all $U$-measurable, while functions of $V$ are independent of $U$; thus, $$ \mathrm E(Y\mid X)=\mu_y+\sigma_y(\rho U+\tau \mathrm E(V))=\mu_y+\sigma_y \rho U, $$ which is equivalent to $$ \color{red}{\mathrm E(Y\mid X)=\mu_y+\rho\frac{\sigma_y}{\sigma_x}(X-\mu_x)}. $$

Likewise, when computing variances conditionally on $X$, deterministic functions of $X$ or $U$ should be considered as constants, hence their conditional variance is zero, while functions of $V$ are independent of $X$, hence their conditional variance is their variance. Thus, $$ \mbox{Var}(Y\mid X)=\mbox{Var}(\sigma_y\tau V\mid X)=\sigma_y^2\tau^2\mbox{Var}(V\mid X)=\sigma_y^2\tau^2\mbox{Var}(V), $$ that is, $$ \color{red}{\mbox{Var}(Y\mid X)=\sigma_y^2(1-\rho^2)}. $$

Finally, the event $$A=[X>\mu_x,Y>\mu_y]$$ is also $$ A=[U>0,\rho U+\tau V>0]. $$ To evaluate $\mathrm P(A)$, one can turn to the planar representation of pairs of independent standard Gaussian random variables, which says in particular that the distribution of $(U,V)$ is invariant under rotations. The event $A$ means that the direction of the vector $(U,V)$ lies between the angle $\vartheta$ in $(-\pi/2,\pi/2)$ such that $$\tan(\vartheta)=-\rho/\tau$$ and the angle $\pi/2$. Thus, $$\mathrm P(A)=\frac{\pi/2-\vartheta}{2\pi},$$ that is, $$ \color{red}{\mathrm P(X>\mu_x,Y>\mu_y)=\frac14+\frac1{2\pi}\arcsin\rho}. $$

Numerical application: If $\mu_x=2$, $\mu_y=-1$, $\sigma_x=2$, $\sigma_y=1$ and $\rho=-\sqrt3/2$, then $$ \mathrm E(Y\mid X)=-1+\sqrt3/2-(\sqrt3/4)X,\qquad \mbox{Var}(Y\mid X)=1/4, $$ and $\tau=1/2$, hence $\vartheta=\pi/3$ and $$\mathrm P(A)=1/12.$$
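As a sanity check on these closed-form results, one can simulate the $(U,V)$ representation above. Here is a short Monte Carlo sketch (Python with NumPy assumed; the variable names are mine) using the numerical application's parameters:

```python
import numpy as np

rng = np.random.default_rng(0)

# OP's numbers: mu_x = 2, mu_y = -1, sigma_x = 2, sigma_y = 1, rho = -sqrt(3)/2
mu_x, mu_y, sigma_x, sigma_y = 2.0, -1.0, 2.0, 1.0
rho = -np.sqrt(3) / 2
tau = np.sqrt(1 - rho**2)            # tau = 1/2 here

# The (U, V) representation: X and Y built from two i.i.d. standard normals
n = 2_000_000
U = rng.standard_normal(n)
V = rng.standard_normal(n)
X = mu_x + sigma_x * U
Y = mu_y + sigma_y * (rho * U + tau * V)

# P(X > mu_x, Y > mu_y): exact value 1/4 + arcsin(rho)/(2 pi) = 1/12
p_hat = np.mean((X > mu_x) & (Y > mu_y))

# Conditional mean and variance of Y, estimated on a thin slice around x = 0;
# the exact values are -1 + sqrt(3)/2 and sigma_y^2 (1 - rho^2) = 1/4
slice_mask = np.abs(X - 0.0) < 0.05
mean_hat = Y[slice_mask].mean()
var_hat = Y[slice_mask].var()

print(p_hat, mean_hat, var_hat)
```

The slice-based estimates are of course only approximations of the true conditional quantities, but with this many samples they land very close to $1/12$, $-1+\sqrt3/2$, and $1/4$.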

  • 0
    How can I calculate $E[X^2\mid Y=y]$, $E[Y^2\mid Y=y]$ or $E[XY\mid Y=y]$ for a bivariate normal distribution? Can you share a link that does this? Everything I see only has $E[X\mid Y=y]$. Thanks. (2018-01-27)
  • 0
    @MonaJalal Then you did not read carefully enough, since a conditional variance is indeed computed above. Note also that the two latter conditional expectations in your comment are trivial; for example, asking for $E(Y^2\mid Y=y)$ is strange, to say the least. (2018-01-27)
  • 0
    I asked because I expanded $E[(X-Y)^2 \mid Y=y]$. Basically, how would you find the value of $E[(X-Y)^2 \mid Y=y]$ for a bivariate normal distribution? (2018-01-27)
  • 0
    @MonaJalal To compute $E((X-Y)^2\mid X)$, I would use the $(U,V)$-representation in my post. To compute $E((X-Y)^2\mid Y)$, I would first compute a similar $(U,V)$-representation with $X$ and $Y$ exchanged. (2018-01-27)
2

First, the joint PDF $f(x,y)$ is explicit: just plug your parameters into the bivariate normal density. Then you can find the marginal density of $X$, which gives you the conditional density of $Y$ given $X=x$: $$f_{Y|X}(y|x)=\frac{f(x,y)}{f_X(x)}.$$ Now, using the conditional density, you can evaluate both the conditional expectation and the conditional variance: $$\mathbb{E} (Y|X=x)=\int_{-\infty}^\infty y f_{Y|X}(y|x)dy,$$ and $$\text{Var} (Y|X=x)=\int_{-\infty}^\infty (y-h(x))^2 f_{Y|X}(y|x)dy=\frac14,$$ where $h(x)=\mathbb{E} (Y|X=x)=-\frac{\sqrt 3}4(x-2)-1$.
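This recipe can also be carried out numerically. Below is a minimal sketch (Python with NumPy assumed; function names are mine) that builds $f_{Y|X}$ from the joint and marginal densities and integrates it on a grid, using the OP's parameters $\mu_x=2$, $\mu_y=-1$, $\sigma_x=2$, $\sigma_y=1$, $\rho=-\sqrt3/2$:

```python
import numpy as np

# OP's parameters
mu_x, mu_y, sigma_x, sigma_y = 2.0, -1.0, 2.0, 1.0
rho = -np.sqrt(3) / 2

def f_joint(x, y):
    """Bivariate normal PDF f(x, y)."""
    zx = (x - mu_x) / sigma_x
    zy = (y - mu_y) / sigma_y
    q = (zx**2 - 2*rho*zx*zy + zy**2) / (1 - rho**2)
    return np.exp(-q/2) / (2*np.pi*sigma_x*sigma_y*np.sqrt(1 - rho**2))

def f_marginal_x(x):
    """Marginal PDF of X, i.e. the N(mu_x, sigma_x^2) density."""
    return np.exp(-0.5*((x - mu_x)/sigma_x)**2) / (sigma_x*np.sqrt(2*np.pi))

x0 = 0.0                                     # condition on X = x0
y = np.linspace(-12.0, 12.0, 24001)          # grid wide enough for the tails
dy = y[1] - y[0]
f_cond = f_joint(x0, y) / f_marginal_x(x0)   # f_{Y|X}(y | x0)

h = np.sum(y * f_cond) * dy                  # E(Y | X = x0)
v = np.sum((y - h)**2 * f_cond) * dy         # Var(Y | X = x0)
print(h, v)   # h should match -(sqrt(3)/4)(x0 - 2) - 1; v should be 1/4
```

For $x_0=0$ this returns $h\approx\sqrt3/2-1\approx-0.134$ and $v\approx1/4$, matching $h(x)$ and the conditional variance above.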

And with the joint PDF, $P(X>\mu_x, Y > \mu_y)$ is just an integration: $$P(X>\mu_x, Y > \mu_y)=\int_{\mu_x}^\infty\int_{\mu_y}^\infty f(x,y)dydx=\frac1{12},$$ though I guess there's an easier way to compute it.
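For completeness, here is a midpoint-rule sketch of that double integral (again Python with NumPy assumed, with the OP's parameters); it should agree with $1/12$:

```python
import numpy as np

# OP's parameters; exact answer is 1/12
mu_x, mu_y, sigma_x, sigma_y = 2.0, -1.0, 2.0, 1.0
rho = -np.sqrt(3) / 2

def f_joint(x, y):
    """Bivariate normal PDF f(x, y)."""
    zx = (x - mu_x) / sigma_x
    zy = (y - mu_y) / sigma_y
    q = (zx**2 - 2*rho*zx*zy + zy**2) / (1 - rho**2)
    return np.exp(-q/2) / (2*np.pi*sigma_x*sigma_y*np.sqrt(1 - rho**2))

# Integrate over [mu_x, mu_x + 8 sigma_x] x [mu_y, mu_y + 8 sigma_y];
# the tail mass truncated beyond 8 standard deviations is negligible.
nx = ny = 800
dx = 8*sigma_x / nx
dy = 8*sigma_y / ny
xs = mu_x + (np.arange(nx) + 0.5) * dx   # cell midpoints in x
ys = mu_y + (np.arange(ny) + 0.5) * dy   # cell midpoints in y
Xg, Yg = np.meshgrid(xs, ys)
p = f_joint(Xg, Yg).sum() * dx * dy
print(p)   # ~ 0.0833 = 1/12
```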

  • 0
    Doesn't this seem a bit too tedious? The integration is quite nasty given the horrific-looking density... Is there no way to neatly solve $\mathrm{Var}\left(-\frac{\sqrt 3}{2}Z_1 + \frac12 Z_2 - 1 \,\middle|\, Z_1 = \frac{x-2}{2}\right)$? With expectations you are allowed to split up the above expression due to linearity. Is there any way to do this with variance? (2011-04-20)
  • 0
    It is complicated. Maybe your approach is simpler. BTW, the conditional variance is $1/2$ according to Mathematica. (2011-04-20)
  • 0
    Hi, I am also working on the bivariate normal; could you tell me how you got $\frac{1}{12}$ at the end? Thanks much! (2013-11-18)