
Given a random vector $X$ of $n$ normally distributed random variables, and an $n \times n$ covariance matrix of those variables with non-zero correlation terms, what is the general methodology for finding the distribution of a non-linear function $f(X_1,X_2,\dots,X_n)$ of the components of $X$?

That's the general formulation of the problem I'm trying to solve. More specifically: given $6$ normally distributed random variables $x_1, \dots, x_6$, what is the probability distribution of $\sqrt{(x_1 - x_4)^2 + (x_2 - x_5)^2 + (x_3 - x_6)^2}$, where $x_1,x_2,x_3$ are correlated with one another and $x_4,x_5,x_6$ are correlated with one another (i.e. the upper-right and lower-left $3 \times 3$ blocks of the correlation matrix are zero, but all other correlation terms are not)?
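For concreteness, here is a Monte Carlo sketch of the specific problem. The block covariance matrix below is an arbitrary illustrative choice (any positive-definite $3 \times 3$ block would do), not data from an actual application:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical covariance: x1..x3 correlated among themselves,
# x4..x6 correlated among themselves, zero cross-block correlation.
A = np.array([[1.0, 0.5, 0.3],
              [0.5, 1.0, 0.4],
              [0.3, 0.4, 1.0]])
V = np.block([[A, np.zeros((3, 3))],
              [np.zeros((3, 3)), A]])
mu = np.zeros(6)

# Draw samples of X = (x1,...,x6) and compute the Euclidean distance
# between the points (x1,x2,x3) and (x4,x5,x6).
X = rng.multivariate_normal(mu, V, size=100_000)
d = np.sqrt(np.sum((X[:, :3] - X[:, 3:]) ** 2, axis=1))
# The empirical distribution of d approximates the target distribution.
```

A histogram of `d` gives a numerical answer even when a closed form is awkward to obtain.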

  • Does "normally distributed" mean that each one separately is normally distributed, or that they are _jointly_ normally distributed? In the former case, you haven't given enough information; in the latter case, it wouldn't hurt to mention that. (2011-09-06)

2 Answers


Assuming the $X_i$ have mean 0, $(X_1 - X_4)^2 + (X_2 - X_5)^2 + (X_3 - X_6)^2$ (or any other quadratic form in the $X_i$) has a generalized chi-square distribution. See http://en.wikipedia.org/wiki/Generalized_chi-square_distribution. If $Y$ is a nonnegative random variable with density $f_Y(y)$, then $S = \sqrt{Y}$ has density $f_S(s) = 2 s f_Y(s^2)$.
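The change-of-variables rule $f_S(s) = 2 s f_Y(s^2)$ can be sanity-checked numerically. The sketch below uses $Y \sim \chi^2_3$ (so $S = \sqrt{Y}$ is chi-distributed with 3 degrees of freedom); the evaluation point $s = 1.5$ and sample size are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(1)

# Y = sum of squares of 3 independent standard normals ~ chi-square(3).
n = 200_000
Z = rng.standard_normal((n, 3))
Y = np.sum(Z ** 2, axis=1)
S = np.sqrt(Y)

def f_Y(y):
    # chi-square density with k = 3: sqrt(y) * exp(-y/2) / sqrt(2*pi)
    return np.sqrt(y) * np.exp(-y / 2) / np.sqrt(2 * np.pi)

def f_S(s):
    # density of S = sqrt(Y) via the rule f_S(s) = 2 s f_Y(s^2)
    return 2 * s * f_Y(s ** 2)

# Estimate the empirical density of S near s = 1.5 with a narrow bin
# and compare it with the formula.
h = 0.05
empirical = np.mean(np.abs(S - 1.5) < h) / (2 * h)
```

The empirical estimate agrees with `f_S(1.5)` up to Monte Carlo error.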

  • @Craig: A random variable with a non-central chi-square distribution is not simply a constant plus a random variable with a central chi-square distribution. In particular, the lower boundary of the support of the probability distribution is still zero. (2011-09-07)

If the covariance matrix of the random column vector $X=(X_1,\ldots,X_n)^T$ is a nonsingular matrix $V$, and if the $X_i$ are jointly (not merely separately) normally distributed, then $V$ has a positive-definite symmetric square root, found by first doing the spectral decomposition and taking the square roots of the eigenvalues. Call that $V^{1/2}$. Then $V^{-1/2}X$ is normally distributed, its entries are not merely uncorrelated but independent, and all the variances are equal to 1. (If the variables are separately, but not jointly, normally distributed, then this transformation will still make the variances 1 and the covariances 0, but one might not have independence, and there are other complications.) This reduces the problem to that of independent standard normals, provided the expected values are 0.
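The whitening step above can be sketched as follows; the covariance matrix is an arbitrary positive-definite example, and the check verifies that $V^{-1/2}X$ has (approximately) identity covariance:

```python
import numpy as np

rng = np.random.default_rng(2)

# Arbitrary positive-definite covariance matrix for illustration.
V = np.array([[2.0, 0.6, 0.3],
              [0.6, 1.5, 0.4],
              [0.3, 0.4, 1.0]])

# Spectral decomposition V = Q diag(w) Q^T gives the symmetric
# inverse square root V^{-1/2} = Q diag(1/sqrt(w)) Q^T.
w, Q = np.linalg.eigh(V)
V_inv_sqrt = Q @ np.diag(1.0 / np.sqrt(w)) @ Q.T

# Whiten samples of X: rows of W are samples of V^{-1/2} X.
X = rng.multivariate_normal(np.zeros(3), V, size=200_000)
W = X @ V_inv_sqrt.T

# The sample covariance of W should be close to the identity.
C = np.cov(W, rowvar=False)
```

Since `V_inv_sqrt` is symmetric, the transpose in `X @ V_inv_sqrt.T` is cosmetic; it is kept to match the row-vector sample layout.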

For quadratic forms in these standardized variables, we would then have chi-square distributions, or (if the means are not all 0) non-central chi-square distributions.
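As a Monte Carlo sanity check of the non-central case: if $Z_i \sim N(\mu_i, 1)$ are independent, then $\sum Z_i^2$ is non-central chi-square with $k$ degrees of freedom and noncentrality $\lambda = \sum \mu_i^2$, so its mean is $k + \lambda$. The means below are arbitrary illustrative values:

```python
import numpy as np

rng = np.random.default_rng(3)

# Arbitrary nonzero means for the illustration.
mu = np.array([1.0, -0.5, 2.0])
k = len(mu)
lam = float(np.sum(mu ** 2))   # noncentrality parameter

# Sample sums of squares of shifted standard normals.
Z = mu + rng.standard_normal((500_000, k))
qf = np.sum(Z ** 2, axis=1)    # ~ noncentral chi-square(k, lam)
```

The sample mean of `qf` matches $k + \lambda$ up to Monte Carlo error, and (as the comment above notes) the support still starts at zero.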