
Consider the random p-vector $X$ passed through some non-linear function $g: \mathbb{R}^p \rightarrow \mathbb{R}^q$ so that we have $g(X)$. I would like to compute the $q \times q$ covariance matrix $\text{Cov}(g(X))$. Here is the caveat though: we only have access to $\text{Cov}(X)$ and $g$.

I am not sure if this is possible, but is there a way of recovering $\text{Cov}(g(X))$ from just $\text{Cov}(X)$ and $g$ (e.g., by using derivative info for $g$)? Obviously if $g$ is linear, then we have $\text{Cov}(A X) = A \text{Cov}(X) A^T$ for some real matrix $A$, but what if $g$ is non-linear?
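For the linear case, the identity $\text{Cov}(AX) = A\,\text{Cov}(X)\,A^T$ is easy to check numerically; a minimal sketch (the matrix $A$, dimensions, and covariance below are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)

# Random 3-vector X with a known covariance, pushed through a linear map A.
p, n = 3, 100_000
A = np.array([[1.0, 2.0, 0.0],
              [0.0, -1.0, 3.0]])
X = rng.multivariate_normal(mean=np.zeros(p), cov=np.eye(p) + 0.5, size=n)

cov_X = np.cov(X, rowvar=False)          # empirical p x p covariance of X
cov_AX = np.cov(X @ A.T, rowvar=False)   # empirical q x q covariance of AX

# Sample covariance is bilinear, so the identity holds exactly
# (up to floating-point error) even on finite samples.
print(np.max(np.abs(cov_AX - A @ cov_X @ A.T)))
```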

  • IMHO, there is no closed formula. First natural idea: linearize $g$... (2017-02-06)
  • I am not sure that this is possible for general $g$, so take this with a grain of salt. Does your $g$ have a functional form? (2017-02-06)
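The linearization idea from the first comment is the delta method: expand $g$ to first order around a point $\mu$ (typically $E[X]$, which is extra information beyond $\text{Cov}(X)$), giving the approximation $\text{Cov}(g(X)) \approx J_g(\mu)\,\text{Cov}(X)\,J_g(\mu)^T$. A minimal sketch, where the function $g$, the point $\mu$, and the covariance are illustrative assumptions:

```python
import numpy as np

def g(x):
    # Example non-linear map R^2 -> R^2 (illustrative choice).
    return np.array([np.sin(x[0]), x[0] * x[1]])

def jacobian(f, x, eps=1e-6):
    # Central-difference Jacobian of f at x.
    x = np.asarray(x, dtype=float)
    cols = []
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = eps
        cols.append((f(x + e) - f(x - e)) / (2 * eps))
    return np.stack(cols, axis=1)

mu = np.array([0.5, 1.0])        # linearization point: requires knowing E[X]
cov_X = np.array([[0.04, 0.01],
                  [0.01, 0.09]])

J = jacobian(g, mu)
cov_gX_approx = J @ cov_X @ J.T  # first-order (delta-method) approximation
print(cov_gX_approx)
```

Note that this is only a first-order approximation, and it needs a linearization point in addition to $\text{Cov}(X)$.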

1 Answer


This is not possible, even in the simplest one-dimensional case. If $X$ is a random variable of which you know only the variance, you cannot determine the variance of $g(X)$ unless $g$ is very special (linear, for example).

As an explicit example, take $X$ and $Y$ to be normally distributed with variance $1$ but different means, say $E(X)=0$ and $E(Y)=1$. Clearly, $X$ and $Y$ have the same covariance matrix (here a $1\times 1$ matrix). Now consider $g(x)=x^2$. Then $X^2$ is chi-squared with one degree of freedom while $Y^2$ is noncentral chi-squared, and the two have different variances: explicitly, $\operatorname{var}(X^2)=2$ and $\operatorname{var}(Y^2)=6$. Therefore, the variance of $X$ is not enough to determine the variance of $g(X)$.
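This counterexample is easy to verify by Monte Carlo simulation (sample size and seed below are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1_000_000

x = rng.normal(loc=0.0, scale=1.0, size=n)   # X ~ N(0, 1)
y = rng.normal(loc=1.0, scale=1.0, size=n)   # Y ~ N(1, 1)

# Same variance before squaring...
print(np.var(x), np.var(y))        # both close to 1

# ...but different variances after applying g(t) = t^2.
print(np.var(x**2), np.var(y**2))  # close to 2 and 6 respectively
```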