If two random variables $u_1$ and $u_2$ have a joint normal distribution, what is the distribution of the random variable $u_1-u_2$?
Distribution of the difference of two jointly normal random variables
-
Hint: convolution. – 2012-11-04
2 Answers
(Ok. My first exciting exercise in LaTeX... ^_^)
We're going to find the distribution of $\xi=u_1-u_2$ in the form of its density $p_{\xi}(x)$. There are two parts to solve here:
- Linearly transform the given random vector $\bar u$ into another random vector $\bar v$ with independent normal components $v_1$ and $v_2$, so that $\bar u=A\bar v$. Then $\xi=\alpha v_1+\beta v_2$ with $\alpha=A_{11}-A_{21}$ and $\beta=A_{12}-A_{22}$.
- Find the distribution of $\alpha v_1+\beta v_2$.
The density of the joint normal distribution of an $n$-vector $\bar u$ with mean vector $\bar \mu$ and covariance matrix $\Sigma$ is $\frac{1}{(2\pi )^{n/2} \vert \Sigma \vert^{1/2}} e^{-\frac{1}{2}(\bar u - \bar \mu)^{\top} \Sigma^{-1} (\bar u - \bar \mu)}$. The components of $\bar u$ are independent if $\Sigma$ is diagonal.
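For concreteness (a standard special case, not spelled out above): for $n=2$ with correlation $\rho$ this density reads
$$\Sigma=\begin{pmatrix}\sigma_1^2&\rho\sigma_1\sigma_2\\ \rho\sigma_1\sigma_2&\sigma_2^2\end{pmatrix},\qquad p_{\bar u}(u_1,u_2)=\frac{1}{2\pi\sigma_1\sigma_2\sqrt{1-\rho^2}}\,\exp\!\left(-\frac{1}{2(1-\rho^2)}\left[\frac{(u_1-\mu_1)^2}{\sigma_1^2}-\frac{2\rho(u_1-\mu_1)(u_2-\mu_2)}{\sigma_1\sigma_2}+\frac{(u_2-\mu_2)^2}{\sigma_2^2}\right]\right),$$
and setting $\rho=0$ (diagonal $\Sigma$) visibly factors it into $p_1(u_1)\,p_2(u_2)$.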
Let $C$ denote an orthogonal matrix such that $\Sigma=C^T\begin{pmatrix}\sigma_1^2&0\\0&\sigma_2^2\end{pmatrix}C$ (the theory of quadratic forms, i.e. the eigendecomposition of the symmetric matrix $\Sigma$, describes how to find it). This is always possible because $\Sigma$ for a non-degenerate distribution is positive definite, so its eigenvalues $\sigma_1^2$ and $\sigma_2^2$ are positive too. Then $\Sigma^{-1}=C^{-1}\begin{pmatrix}1/\sigma_1^2&0\\0&1/\sigma_2^2\end{pmatrix}(C^{-1})^T$, i.e. the substitution $\bar u=C^T\bar v$ brings the $\Sigma^{-1}$ part of the density function to diagonal form; and since $C$ is orthogonal, the Jacobian of the substitution is $1$, so $\bar v$ is normal with independent components. So we have $p_{\bar v}(x,y)=p_1(x)p_2(y)$, where $p_1$ and $p_2$ are the normal densities of $v_1$ and $v_2$ respectively.
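Not part of the original answer, but here is a minimal numerical sketch of this diagonalization step using NumPy's eigendecomposition (the example $\Sigma$, $\bar\mu$ and all variable names are my own):

```python
import numpy as np

rng = np.random.default_rng(0)

# Example parameters (arbitrary, positive-definite Sigma).
Sigma = np.array([[2.0, 0.8],
                  [0.8, 1.0]])
mu = np.array([1.0, -1.0])

# Eigendecomposition: Sigma = Q @ diag(lam) @ Q.T with Q orthogonal.
# In the answer's notation C = Q.T, so Sigma = C.T @ diag(sigma_i^2) @ C.
lam, Q = np.linalg.eigh(Sigma)
C = Q.T

# Draw correlated samples u; the substitution u - mu = C.T @ v,
# i.e. v = C @ (u - mu), should decorrelate the components.
u = rng.multivariate_normal(mu, Sigma, size=200_000)
v = (u - mu) @ C.T                  # rows are samples of v

print(np.cov(v.T).round(3))         # ~ diag(lam): independent components
print(lam.round(3))                 # eigenvalues sigma_1^2, sigma_2^2
```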
Now let's find $p_{\xi}(z)=F'_{\xi}(z)=\displaystyle\frac{dP\{u_1-u_2\leq z\}}{dz}=\frac{dP\{\alpha v_1+\beta v_2\leq z\}}{dz}$. We integrate the joint density $p_{\bar v}$ over the half-plane $\alpha x+\beta y\leq z$ and then differentiate with respect to $z$. Depending on the sign of $\beta$:
If $\beta>0$, then $\alpha x+\beta y\leq z \Leftrightarrow y \leq\frac{z-\alpha x}{\beta}$, so
$$F_\xi(z)=\iint\limits_{\alpha x+\beta y\leq z}p_1(x)p_2(y)\,dx\,dy=\int\limits_\mathbb{R} p_1(x)\int\limits_{-\infty}^{\frac{z-\alpha x}{\beta}}p_2(y)\,dy\,dx=\int\limits_\mathbb{R} p_1(x)\,F_2\!\left(\tfrac{z-\alpha x}{\beta}\right)dx,$$
hence
$$p_\xi(z)=F'_\xi(z)=\frac{1}{\beta}\int\limits_\mathbb{R} p_1(x)\,p_2\!\left(\tfrac{z-\alpha x}{\beta}\right)dx,$$
or, after the substitution $x\to\beta x$:
$$p_\xi(z)=\int\limits_\mathbb{R} p_1(\beta x)\,p_2\!\left(\tfrac{z}{\beta}-\alpha x\right)dx.$$
Similarly, if $\beta<0$, then $\alpha x+\beta y\leq z \Leftrightarrow y \geq\frac{z-\alpha x}{\beta}$, so
$$F_\xi(z)=\int\limits_\mathbb{R} p_1(x)\int\limits_{\frac{z-\alpha x}{\beta}}^{\infty}p_2(y)\,dy\,dx=\int\limits_\mathbb{R} p_1(x)\left(1-F_2\!\left(\tfrac{z-\alpha x}{\beta}\right)\right)dx,$$
hence
$$p_\xi(z)=F'_\xi(z)=-\frac{1}{\beta}\int\limits_\mathbb{R} p_1(x)\,p_2\!\left(\tfrac{z-\alpha x}{\beta}\right)dx=\frac{1}{|\beta|}\int\limits_\mathbb{R} p_1(x)\,p_2\!\left(\tfrac{z-\alpha x}{\beta}\right)dx$$
(note the extra minus sign from differentiating $-F_2$). The same substitution $x\to\beta x$, which now also flips the integration limits and absorbs the sign, again gives
$$p_\xi(z)=\int\limits_\mathbb{R} p_1(\beta x)\,p_2\!\left(\tfrac{z}{\beta}-\alpha x\right)dx.$$
If $\beta=0$, then the same logic can be repeated with $\alpha$, giving
$$p_\xi(z)=\int\limits_\mathbb{R} p_1\!\left(\tfrac{z}{\alpha}-\beta y\right)p_2(\alpha y)\,dy.$$
P.S. Oh, of course I forgot about the mean $\bar \mu$. Never mind: it just produces some constant in the transformation $u_1-u_2 \leq z \Rightarrow \alpha v_1+\beta v_2+\text{const} \leq z$, and this constant is carried along with $z$ all the way to the final result. I won't edit the expressions above, as that would be total overkill for me in this very first exercise after 4 a.m. on the clock... ^_^
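To close the loop (this final step is not in the original answer): with zero-mean normal densities $p_i(x)=\frac{1}{\sqrt{2\pi}\,\sigma_i}e^{-x^2/2\sigma_i^2}$ (per the P.S., the means only shift $z$ by $\mu_1-\mu_2$), completing the square in the convolution gives
$$p_\xi(z)=\frac{1}{|\beta|}\int\limits_\mathbb{R} p_1(x)\,p_2\!\left(\tfrac{z-\alpha x}{\beta}\right)dx=\frac{1}{\sqrt{2\pi\left(\alpha^2\sigma_1^2+\beta^2\sigma_2^2\right)}}\,\exp\!\left(-\frac{z^2}{2\left(\alpha^2\sigma_1^2+\beta^2\sigma_2^2\right)}\right),$$
i.e. $\xi=u_1-u_2$ is itself normal with variance $\alpha^2\sigma_1^2+\beta^2\sigma_2^2$, in agreement with the other answer.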
Since $u_1$ and $u_2$ are jointly normal, every linear combination of them is normal (joint normality, not just normality of the marginals, is what matters here). So $u_1-u_2$ is normal with mean $E(u_1)-E(u_2)$ and variance $\text{Var}(u_1-u_2)=\text{Var}(u_1)+\text{Var}(u_2)-2\,\text{Cov}(u_1, u_2)$. All of this information can be read off from their joint distribution.
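A quick Monte Carlo sanity check of these formulas (my own sketch; the parameters are arbitrary examples):

```python
import numpy as np

rng = np.random.default_rng(1)

# Arbitrary example: means, variances and covariance of (u1, u2).
mu = np.array([2.0, 0.5])
Sigma = np.array([[1.5, 0.6],
                  [0.6, 2.0]])    # Cov(u1, u2) = 0.6

u = rng.multivariate_normal(mu, Sigma, size=1_000_000)
d = u[:, 0] - u[:, 1]

# Predicted: E(d) = mu1 - mu2, Var(d) = Var(u1) + Var(u2) - 2 Cov(u1, u2).
print(d.mean(), mu[0] - mu[1])                               # ~ 1.5
print(d.var(), Sigma[0, 0] + Sigma[1, 1] - 2 * Sigma[0, 1])  # ~ 2.3
```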