
If $X_1, ..., X_n$ are normal random variables, will $[X_1, ..., X_n]$ be a normal random vector?

I know that if $X_1, ..., X_n$ are additionally independent, then $[X_1, ..., X_n]$ is a normal random vector. Is there a condition on $X_1, ..., X_n$ more relaxed than independence?

Thanks in advance!

3 Answers

4

Saying that $[X_1,...,X_n]$ is a normal (or Gaussian) random vector is the same as saying that the variables are jointly normal, or that the vector is multivariate Gaussian. That each $X_i$ is normal (i.e., that the marginal distributions are normal) is necessary but not sufficient (it is easy to give counterexamples; see the comment below). That each $X_i$ is normal and they are independent is, as you say, sufficient but not necessary. A general (necessary and sufficient) condition can be expressed in terms of an affine transformation of iid standard normal scalar variables.

Specifically: given $Z_1, \ldots, Z_n$ iid standard normal (mean 0 and variance 1), the vector $X = A Z + b$ (with $A$ any square nonsingular matrix and $b$ any fixed vector) is jointly normal, and every nondegenerate normal vector arises this way. From this construction one obtains the density formula of the general multivariate Gaussian. Conversely, if $X$ is a (nondegenerate) normal random vector, then one can find a matrix $C$ and a vector $d$ such that $Z = C X + d$ is a vector of iid standard normal variables. This is the multivariate generalization of the well-known formula for standardizing a Gaussian: $z = \frac{x-\mu}{\sigma}$.
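A minimal numerical sketch of both directions (assuming NumPy; the specific $A$ and $b$ below are just random illustrative choices): build $X = AZ + b$ from iid standard normals, then recover iid standard normals via $Z = CX + d$ with $C = A^{-1}$ and $d = -A^{-1}b$.

```python
# Sketch: X = A Z + b is jointly normal; C = A^{-1}, d = -A^{-1} b standardizes it back.
import numpy as np

rng = np.random.default_rng(0)
n = 3
A = rng.standard_normal((n, n))          # any nonsingular square matrix (illustrative)
b = rng.standard_normal(n)               # any fixed vector (illustrative)

Z = rng.standard_normal((100_000, n))    # rows are iid N(0, I) vectors
X = Z @ A.T + b                          # rows are jointly normal, mean b, covariance A A^T

# Empirical mean and covariance should be close to b and A A^T.
print(np.allclose(X.mean(axis=0), b, atol=0.05))
print(np.allclose(np.cov(X, rowvar=False), A @ A.T, atol=0.05))

# Standardize back: Z = C X + d with C = A^{-1}, d = -A^{-1} b gives iid N(0, 1) components.
C = np.linalg.inv(A)
d = -C @ b
Z_back = X @ C.T + d
print(np.allclose(np.cov(Z_back, rowvar=False), np.eye(n), atol=0.05))
```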

  • @Ethan: Let $X_1, X_2$ be iid standard normal (so that they are jointly normal). Define $Y_1 = X_1$ and $Y_2 = |X_2| \operatorname{sign}(X_1)$. Then $Y_1$ and $Y_2$ always have the same sign, each is marginally Gaussian, but they are not jointly Gaussian (graphically: take the bell-shaped joint Gaussian density and remove the 2nd and 4th quadrants). (2011-11-10)
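A quick simulation of that counterexample (a sketch, assuming NumPy and SciPy): both marginals are exact standard normals by construction, but the linear combination $Y_1 + Y_2$ is far from normal, so $(Y_1, Y_2)$ cannot be a Gaussian vector.

```python
# Sketch: Y1 and Y2 are each marginally N(0, 1), but (Y1, Y2) is not jointly Gaussian;
# the sum Y1 + Y2 = sign(X1) * (|X1| + |X2|) is bimodal (its density vanishes at 0),
# so a normality test rejects it decisively.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
X1 = rng.standard_normal(50_000)
X2 = rng.standard_normal(50_000)

Y1 = X1
Y2 = np.abs(X2) * np.sign(X1)            # same sign as Y1 by construction

print(stats.normaltest(Y1).pvalue)       # marginal: a genuine N(0, 1) sample
print(stats.normaltest(Y2).pvalue)       # marginal: also a genuine N(0, 1) sample
print(stats.normaltest(Y1 + Y2).pvalue)  # essentially zero: the sum is not normal
```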
3

$(X_1,\ldots,X_n)$ is a normal vector if and only if every linear combination $\sum\nolimits_{i = 1}^n {a_i X_i }$, $a_i \in \mathbb{R}$, is a univariate normal variable (here a constant is also regarded as a normal variable, namely a degenerate one with variance $0$).
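Here is a sketch of why this criterion characterizes normal vectors, via characteristic functions (the "Fourier transform" hinted at in the comment below); the notation $X = (X_1,\ldots,X_n)$, $\mu = E[X]$, $\Sigma = \operatorname{Cov}(X)$ and $a = (a_1,\ldots,a_n)$ is introduced only for this sketch. If every $a^\top X = \sum_i a_i X_i$ is univariate normal, then $a^\top X \sim N(a^\top \mu,\, a^\top \Sigma\, a)$, so

$$\varphi_X(a) \;=\; E\!\left[e^{\,i\,a^\top X}\right] \;=\; \varphi_{a^\top X}(1) \;=\; \exp\!\Big(i\,a^\top \mu \;-\; \tfrac{1}{2}\,a^\top \Sigma\, a\Big),$$

which is exactly the characteristic function of $N(\mu, \Sigma)$; since characteristic functions determine distributions, $X$ is a normal vector. The converse reads the same identity in the other direction.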

Concerning the first question, let $X$ be an arbitrary Gaussian process indexed by $T$ (for example a Brownian motion or bridge, where $T=[0,\infty)$ or $T=[0,1]$, respectively). Then, by definition, $(X_{t_1},\ldots,X_{t_n})$ is a normal vector, for any $n \geq 1$ and $t_1,\ldots,t_n \in T$, but the components are (in general) dependent random variables.
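A small numerical sketch of the Brownian-motion example (assuming NumPy; the sample times below are arbitrary illustrative choices): the vector $(X_{t_1},\ldots,X_{t_n})$ is normal with covariance $\min(t_i,t_j)$, and its components are strongly correlated, hence dependent.

```python
# Sketch: sample the Brownian-motion vector (X_{t1}, ..., X_{tn}), jointly normal
# with Cov(X_s, X_t) = min(s, t); its components are correlated, hence dependent.
import numpy as np

rng = np.random.default_rng(2)

t = np.array([0.25, 0.5, 1.0])           # illustrative sample times t_1 < t_2 < t_3
cov = np.minimum.outer(t, t)             # Cov(X_s, X_t) = min(s, t)

Xt = rng.multivariate_normal(np.zeros(len(t)), cov, size=100_000)

# Off-diagonal correlations equal sqrt(t_i / t_j) for t_i < t_j, far from zero.
print(np.corrcoef(Xt, rowvar=False))
```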

  • @Ethan: Fourier transform. (2011-06-29)
2

To add a little to leonbloy's excellent answer, I would say that $\vec{X} = (X_1, \ldots, X_n)$ is called a Gaussian vector (equivalently, that $X_1, \ldots, X_n$ are called jointly Gaussian random variables) if $\vec{X} = \vec{Z}A + \vec{\mu}_X$, where $\vec{Z} = (Z_1, \ldots, Z_k)$ is a vector of standard (zero-mean, unit-variance) independent Gaussian random variables, $1 \leq k \leq n$, $A$ is a $k \times n$ matrix, and $\vec{\mu}_X = E[\vec{X}]$ is the mean vector of $\vec{X}$. This allows us to call $(X, Y) = (aZ+b, cZ+d)$ a Gaussian vector even though the joint density function $f_{X,Y}(x,y)$ cannot be expressed by the usual formula for the bivariate Gaussian density, because that formula involves the factor $(1 - \rho^2)^{-1}$ while here $\vert \rho\vert = 1$, since $X$ and $Y$ are perfectly correlated. As leonbloy points out, when $k = n$ the $n$-variate density formula for jointly Gaussian random variables can be obtained straightforwardly from this description.
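A minimal numerical sketch of that degenerate case (assuming NumPy; the constants $a, b, c, d$ below are arbitrary illustrative values): the covariance matrix of $(aZ+b, cZ+d)$ is singular, so there is no density of the usual bivariate-normal form, yet the pair is a Gaussian vector by the definition above (here $k = 1$, $n = 2$).

```python
# Sketch: (X, Y) = (aZ + b, cZ + d) is a Gaussian vector with singular covariance
# (|rho| = 1), so it admits no density of the usual bivariate-normal form.
import numpy as np

rng = np.random.default_rng(3)
a, b, c, d = 2.0, 1.0, -3.0, 0.5         # arbitrary illustrative constants

Z = rng.standard_normal(100_000)
X, Y = a * Z + b, c * Z + d

cov = np.cov(np.vstack([X, Y]))          # approximately [[a^2, a*c], [a*c, c^2]]
rho = cov[0, 1] / np.sqrt(cov[0, 0] * cov[1, 1])
print(rho)                               # -1 here: perfect (negative) correlation
print(np.linalg.det(cov))                # approximately 0: singular covariance matrix
```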