
The following is Sheldon Ross's definition:

We say that the random variables $X,Y$ have a bivariate normal distribution if, for some constants $\mu_x,\mu_y,\sigma_x>0,\sigma_y>0,-1<\rho<1$, their joint density function is given, for all $-\infty<x,y<\infty$, by $$f(x,y)=\frac{\exp\left(-\frac1{2(1-\rho^2)}\left(\left(\frac{x-\mu_x}{\sigma_x}\right)^2+\left(\frac{y-\mu_y}{\sigma_y}\right)^2-2\rho\frac{(x-\mu_x)(y-\mu_y)}{\sigma_x\sigma_y}\right)\right)}{2\pi\sigma_x\sigma_y\sqrt{1-\rho^2}}$$

Is there a combinatorial/intuitive meaning of this definition?

3 Answers


I don't have a combinatorial meaning, but you can think of it as follows. $(X,Y)$ is the result of applying an affine transformation to a pair $(W,Z)$ of independent standard normal random variables. Many such transformations exist, and one in particular is

$\begin{align*} X &= \mu_x + \sigma_x W\\ Y &= \mu_y + \rho \sigma_y W + \sqrt{1-\rho^2} \sigma_y Z \end{align*}$

See for example this set of slides. The contours of the joint density (points at equal height above the $x$-$y$ plane) are ellipses centered at $(\mu_x,\mu_y)$.
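
Here is a minimal NumPy sketch of this construction (the particular values of $\mu_x,\mu_y,\sigma_x,\sigma_y,\rho$ below are placeholders, not taken from the answer); the simulated pairs come out with the prescribed means, standard deviations and correlation:

```python
import numpy as np

# Minimal sketch of the affine construction above; the parameter values
# are illustrative placeholders.
rng = np.random.default_rng(0)
mu_x, mu_y = 1.0, -2.0
sigma_x, sigma_y = 2.0, 0.5
rho = 0.7

n = 100_000
W = rng.standard_normal(n)  # independent standard normals
Z = rng.standard_normal(n)

X = mu_x + sigma_x * W
Y = mu_y + rho * sigma_y * W + np.sqrt(1 - rho**2) * sigma_y * Z

# The sample correlation should be close to rho, and the sample means and
# standard deviations close to (mu_x, mu_y) and (sigma_x, sigma_y).
print(np.corrcoef(X, Y)[0, 1])
print(X.mean(), X.std(), Y.mean(), Y.std())
```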

  • @Didier Thanks for inserting the missing $\sigma_y$. (2012-04-16)

There are several equivalent definitions of a random vector being multivariate normal. Each characterization is of course important, but sometimes one of them has a greater appeal from an intuitive standpoint.

One such characterization is the following:

One can show that a random vector $(X,Y)$ is bivariate normal iff the linear combination $a_1X+a_2Y$ is normal for every vector $(a_1,a_2)$.

This result generalizes to higher dimensions, i.e. for random vectors of dimension $n$.

The nice thing about this definition of a multivariate normal variable is that many results become almost trivial to prove. For example, it follows immediately that if $X$ is multivariate normal, any marginal distribution of $X$ is (multivariate) normal.
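
For instance, taking $(a_1,a_2)=(1,0)$ recovers the statement that the marginal of $X$ is normal, while taking $(a_1,a_2)=(1,1)$ gives, in the notation of the question, $$X+Y\sim N\!\left(\mu_x+\mu_y,\;\sigma_x^2+\sigma_y^2+2\rho\sigma_x\sigma_y\right).$$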

  • That, or a Dirac mass at $(0,0)$. (2012-04-16)

Another characterization of a multivariate normal is: a random vector ${\bf X}$ is multivariate normal iff its density has the form

$ f_{\bf X}({\bf x}) = \alpha \; e^{-g\,({\bf x})}$

with $g\,({\bf x})$ being a positive definite quadratic form (i.e., it can be written as $g\,({\bf x})={\bf x}^T {\bf A} {\bf x} + {\bf b}^T {\bf x} + c$ with ${\bf A}$ positive definite, ${\bf b}$ and $c$ arbitrary), and $\alpha$ the appropriate normalization constant. Applying this to the two-dimensional case, and expressing the degrees of freedom as a function of the probabilistic parameters $\mu_x$, $\mu_y$, $\sigma_x^2$, $\sigma_y^2$ and $\rho$, you get the above formula.
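
Concretely, in the bivariate case (writing ${\bf x}=(x,y)^T$ and $\boldsymbol{\mu}=(\mu_x,\mu_y)^T$, and centering the quadratic form, which corresponds to ${\bf b}=-2{\bf A}\boldsymbol{\mu}$ and $c=\boldsymbol{\mu}^T{\bf A}\boldsymbol{\mu}$) one can take $$ {\bf A}=\frac{1}{2(1-\rho^2)}\begin{pmatrix}\frac{1}{\sigma_x^2} & -\frac{\rho}{\sigma_x\sigma_y}\\ -\frac{\rho}{\sigma_x\sigma_y} & \frac{1}{\sigma_y^2}\end{pmatrix},\qquad g({\bf x})=({\bf x}-\boldsymbol{\mu})^T{\bf A}\,({\bf x}-\boldsymbol{\mu}),\qquad \alpha=\frac{1}{2\pi\sigma_x\sigma_y\sqrt{1-\rho^2}}; $$ expanding $g$ reproduces exactly the exponent in Ross's formula, and ${\bf A}$ is positive definite precisely because $\sigma_x,\sigma_y>0$ and $|\rho|<1$.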

  • @DilipSarwate: by degrees of freedom I simply mean parameters, as scalars; e.g., an arbitrary quadratic in 2D has 6 parameters (3 for the matrix ${\bf A}$, 2 for ${\bf b}$, 1 for $c$), but one of them is absorbed into the normalization constant. (2012-04-16)