2

Suppose a random variable $J$ is jointly distributed between $X$ and $Y$. Then,

$ E(J) = \int_{-\infty}^\infty \int_{-\infty}^\infty f(x, y) dxdy $

However, how do I calculate the variance of $J$? Had $J$ been a non-jointly distributed random variable, then we could use $Var(J) = E(J^2) - E(J)^2$. However, what is $E(J^2)$ in this case?

By the law of the unconscious statistician, $E(J^2) = \int_{-\infty}^\infty \int_{-\infty}^\infty j^2 f(x, y) dxdy$.

What should I supply for $j^2$ though? $x^2$ or $y^2$?

  • For the mean, let $g(x,y)$ be the minimum of $x$ and $y$. Then we want $\iint g(x,y)f(x,y)\,dx\,dy$. For the variance, first calculate the expectation of $J^2$, which we get by replacing $g(x,y)$ by $(g(x,y))^2$. – 2012-11-04

2 Answers

3

The question has evolved in the comments, and now may be asking the following. Let $J$ be the minimum of $X$ and $Y$. What is the variance of $J$? We assume that the joint density function of $X$ and $Y$ is $f(x,y)$.

An answer goes as follows. Let $m(x,y)$ be the minimum of $x$ and $y$. Then $E(J)=\int_{-\infty}^\infty\int_{-\infty}^\infty m(x,y)f(x,y)\,dx\,dy.$ As for the variance, it is $E(J^2)-(E(J))^2$, and $E(J^2)=\int_{-\infty}^\infty\int_{-\infty}^\infty (m(x,y))^2f(x,y)\,dx\,dy.$
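Before evaluating any integrals, these formulas can be sanity-checked by simulation. The sketch below is a Monte Carlo check, and the choice of independent $\mathrm{Exp}(1)$ variables is purely illustrative (not part of the question); with that choice $J=\min(X,Y)\sim\mathrm{Exp}(2)$, so $Var(J)=1/4$ exactly.

```python
import numpy as np

# Monte Carlo check of Var(J) = E(J^2) - (E(J))^2 for J = min(X, Y).
# Illustrative assumption: X, Y independent Exp(1), so J ~ Exp(2)
# and Var(J) = 1/4 exactly.
rng = np.random.default_rng(0)
x = rng.exponential(1.0, size=1_000_000)
y = rng.exponential(1.0, size=1_000_000)
j = np.minimum(x, y)

var_j = np.mean(j**2) - np.mean(j)**2
print(var_j)  # close to 0.25
```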

In evaluating the integrals, we probably will want to use the following strategy, which we illustrate with the integral for the mean of $J$. Divide the plane into two parts, the part below $y=x$ and the part above. Then our integral is the sum of the integrals over the two parts.

In the part with $y\lt x$, we have $m(x,y)=y$. So our integral over this part is $\int_{x=-\infty}^\infty\int_{y=-\infty}^x yf(x,y)\,dy\,dx.$ The integral over the part where $x\lt y$ is obtained in the same way, except for some minor changes. It is $\int_{y=-\infty}^\infty\int_{x=-\infty}^y xf(x,y)\,dx\,dy.$ Add these.

The integral for calculating $E(J^2)$ can be broken up in exactly the same way. Instead of integrating $yf(x,y)$ or $xf(x,y)$ over suitable regions, we will be integrating $y^2f(x,y)$ and $x^2f(x,y)$ over the same regions.
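The region-splitting strategy above can be sketched numerically. As before the concrete density is an illustrative assumption, not from the question: take $X,Y$ independent $\mathrm{Exp}(1)$, so $f(x,y)=e^{-x-y}$ on $x,y\geqslant 0$ and $J\sim\mathrm{Exp}(2)$, whose mean and variance are $1/2$ and $1/4$.

```python
import numpy as np
from scipy.integrate import dblquad

# Illustrative assumption: X, Y independent Exp(1), f(x, y) = exp(-x - y).
f = lambda x, y: np.exp(-x - y)

# Part with y < x: integrate y * f(x, y), inner variable y from 0 to x.
# Note dblquad integrates func(inner, outer); here inner = y, outer = x.
below, _ = dblquad(lambda y, x: y * f(x, y), 0, np.inf, 0, lambda x: x)
# Part with x < y: integrate x * f(x, y), inner variable x from 0 to y.
above, _ = dblquad(lambda x, y: x * f(x, y), 0, np.inf, 0, lambda y: y)
e_j = below + above

# Same split for E(J^2), with y^2 and x^2 in place of y and x.
below2, _ = dblquad(lambda y, x: y**2 * f(x, y), 0, np.inf, 0, lambda x: x)
above2, _ = dblquad(lambda x, y: x**2 * f(x, y), 0, np.inf, 0, lambda y: y)
var_j = (below2 + above2) - e_j**2

print(e_j, var_j)  # mean 0.5 and variance 0.25 of an Exp(2) variable
```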

  • Thanks for being so thorough! – 2012-11-04
1

Let $\bar F:(x,y)\mapsto\mathbb P(X\geqslant x,Y\geqslant y)$ denote the complementary CDF of $(X,Y)$ and $G=-(\partial_x\bar F+\partial_y\bar F)$. Then $t\mapsto G(t,t)$ is the PDF of $J=\min\{X,Y\}$, hence
$$\mathbb E(J)=\int tG(t,t)\,\mathrm dt,\qquad\mathbb E(J^2)=\int t^2G(t,t)\,\mathrm dt,$$
from which the variance of $J$ follows.

In the case when $X$ and $Y$ are independent, $\bar F(x,y)=\bar F_X(x)\bar F_Y(y)$, where $\bar F_X$ and $\bar F_Y$ are the complementary CDFs of $X$ and $Y$ respectively, defined by $\bar F_X(x)=\mathbb P(X\geqslant x)$ and $\bar F_Y(y)=\mathbb P(Y\geqslant y)$. If furthermore the distributions of $X$ and $Y$ have densities $f_X$ and $f_Y$, then $\bar F_X'(x)=-f_X(x)$ and $\bar F_Y'(y)=-f_Y(y)$, hence
$$G(x,y)=f_X(x)\bar F_Y(y)+f_Y(y)\bar F_X(x),$$
and the formulas above apply, with
$$G(t,t)=f_X(t)\bar F_Y(t)+f_Y(t)\bar F_X(t)=f_X(t)\,\mathbb P(Y\geqslant t)+f_Y(t)\,\mathbb P(X\geqslant t).$$
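This one-dimensional route can also be checked numerically. Under the same illustrative assumption as before (independent $\mathrm{Exp}(1)$ variables, not from the question), $f_X(t)=f_Y(t)=\bar F_X(t)=\bar F_Y(t)=e^{-t}$ for $t\geqslant 0$, so $G(t,t)=2e^{-2t}$, the $\mathrm{Exp}(2)$ density, as expected for the minimum.

```python
import numpy as np
from scipy.integrate import quad

# Illustrative assumption: X, Y independent Exp(1), so for t >= 0
# f_X(t) = f_Y(t) = exp(-t) and Fbar_X(t) = Fbar_Y(t) = exp(-t), giving
# G(t, t) = f_X(t)*Fbar_Y(t) + f_Y(t)*Fbar_X(t) = 2*exp(-2t).
g = lambda t: np.exp(-t) * np.exp(-t) + np.exp(-t) * np.exp(-t)

e_j, _ = quad(lambda t: t * g(t), 0, np.inf)       # E(J)
e_j2, _ = quad(lambda t: t**2 * g(t), 0, np.inf)   # E(J^2)
var_j = e_j2 - e_j**2

print(e_j, var_j)  # 0.5 and 0.25, matching the Exp(2) distribution
```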