
Let $X$ and $Y$ be two non-independent Gaussian random variables with means $\mu_X$, $\mu_Y$ and variances $\sigma_X^2$, $\sigma_Y^2$, respectively, and let $\Sigma$ be the covariance between $X$ and $Y$. I'd like a simple formula for $\mathbb{E}[X^2Y]$ and $\mathbb{E}[X^3Y]$ as a function of these means, variances and covariance.

Is there a way to compute these other than direct (and brutal) computation of the integral (using the appropriate density function, corresponding to the law of $X^2Y$ and $X^3Y$ respectively)? I expect to obtain a simple formula, similar to the known closed form for the moments of a Gaussian random variable. There seems to be a smart and simple idea that avoids a direct computation of integrals.

1 Answer


If the crucial hypothesis (which is missing from your post) that $(X,Y)$ is jointly normal holds, a simple approach is to realize $X$ and $Y$ through i.i.d. standard normal random variables. Namely, $ X=\mu_X+\sigma_X\xi,\qquad Y=\mu_Y+\sigma_Y\cos(t)\xi+\sigma_Y\sin(t)\eta, $ where $(\xi,\eta)$ are i.i.d. standard normal random variables and the angle $t$ is such that $\sigma_X\sigma_Y\cos(t)=C_{X,Y}$ where $C_{X,Y}=\mathrm{Cov}(X,Y)$.
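As a sanity check, this construction can be simulated; the sketch below uses illustrative parameter values of my choosing (any $C_{X,Y}$ with $|C_{X,Y}|\leqslant\sigma_X\sigma_Y$ works, which every genuine covariance satisfies):

```python
import numpy as np

# Sketch: realize correlated (X, Y) from i.i.d. standard normals (xi, eta),
# as in the construction above. Parameter values are illustrative only.
rng = np.random.default_rng(0)
mu_X, mu_Y, sigma_X, sigma_Y, C = 1.0, -0.5, 2.0, 1.5, 1.2

# Angle t with sigma_X * sigma_Y * cos(t) = C (possible since |C| <= sigma_X * sigma_Y).
t = np.arccos(C / (sigma_X * sigma_Y))

n = 10**6
xi = rng.standard_normal(n)
eta = rng.standard_normal(n)
X = mu_X + sigma_X * xi
Y = mu_Y + sigma_Y * np.cos(t) * xi + sigma_Y * np.sin(t) * eta

# Empirical means and covariance match the targets up to Monte Carlo error.
print(X.mean(), Y.mean(), np.cov(X, Y)[0, 1])
```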

(Proof: Define the random variables $\xi$ and $\eta$ through the identities above and check that $\mathrm E(\xi^2)=\mathrm E(\eta^2)=1$ and $\mathrm E(\xi)=\mathrm E(\eta)=\mathrm E(\xi\eta)=0$. End of the proof.)

Then, one can express everything in terms of moments of $\xi$ and $\eta$ only. For example, $ X^2Y=(\mu_X^2+2\mu_X\sigma_X\xi+\sigma_X^2\xi^2)\cdot(\mu_Y+\sigma_Y\cos(t)\xi+\sigma_Y\sin(t)\eta), $ and $\mathrm E(\xi)=\mathrm E(\eta)=\mathrm E(\xi\eta)=\mathrm E(\xi^3)=0$ while $\mathrm E(\xi^2)=\mathrm E(\eta^2)=1$. Expanding the product and taking the expectation of each term yields $ \mathrm E(X^2Y)=\mu_X^2\mu_Y+2\mu_X\sigma_X\sigma_Y\cos(t)+\sigma_X^2\mu_Y, $ that is, $ \color{red}{\mathrm E(X^2Y)=\mu_X^2\mu_Y+2\mu_XC_{X,Y}+\sigma_X^2\mu_Y}. $ Sanity check: If $\mathrm{Cov}(X,Y)=0$, the result is $\mathrm E(X^2)\mathrm E(Y)$, as it should be.
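The bookkeeping can also be done symbolically; this sketch (using `sympy`, with names of my choosing) expands $X^2Y$ as a polynomial in $(\xi,\eta)$ and replaces each monomial $\xi^a\eta^b$ by $\mathrm E(\xi^a)\,\mathrm E(\eta^b)$, which is valid by independence:

```python
import sympy as sp

# Symbolic check of the boxed formula, with X and Y realized as above:
# X = mu_X + s_X*xi, Y = mu_Y + s_Y*cos(t)*xi + s_Y*sin(t)*eta.
mu_X, mu_Y, s_X, s_Y, t = sp.symbols('mu_X mu_Y sigma_X sigma_Y t')
xi, eta = sp.symbols('xi eta')

X = mu_X + s_X * xi
Y = mu_Y + s_Y * sp.cos(t) * xi + s_Y * sp.sin(t) * eta

def std_normal_moment(k):
    # E[xi^k] for a standard normal: 0 for odd k, (k-1)!! for even k.
    return 0 if k % 2 else sp.factorial2(k - 1)

# E(X^2 Y): replace xi^a * eta^b by E[xi^a] * E[eta^b] (independence).
poly = sp.Poly(sp.expand(X**2 * Y), xi, eta)
expectation = sum(c * std_normal_moment(a) * std_normal_moment(b)
                  for (a, b), c in poly.terms())

C = s_X * s_Y * sp.cos(t)  # the covariance of X and Y
claimed = mu_X**2 * mu_Y + 2 * mu_X * C + s_X**2 * mu_Y
print(sp.simplify(expectation - claimed))  # 0
```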

You might want to compute $\mathrm E(X^3Y)$ using the same technique, an ingredient not used above being the numerical value $\mathrm E(\xi^4)=3$.
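Carrying that computation out (the answer leaves it as an exercise, so the closed form below is my own derivation) gives $\mathrm E(X^3Y)=\mu_X^3\mu_Y+3\mu_X^2C_{X,Y}+3\mu_X\mu_Y\sigma_X^2+3\sigma_X^2C_{X,Y}$, which a quick Monte Carlo sketch can spot-check:

```python
import numpy as np

# Monte Carlo spot-check of the derived formula
# E(X^3 Y) = mu_X^3*mu_Y + 3*mu_X^2*C + 3*mu_X*mu_Y*sigma_X^2 + 3*sigma_X^2*C,
# obtained by the same technique, now also using E[xi^4] = 3.
rng = np.random.default_rng(1)
mu_X, mu_Y, s_X, s_Y, C = 0.7, 1.3, 1.1, 0.9, 0.5  # illustrative values

t = np.arccos(C / (s_X * s_Y))
n = 2 * 10**6
xi = rng.standard_normal(n)
eta = rng.standard_normal(n)
X = mu_X + s_X * xi
Y = mu_Y + s_Y * np.cos(t) * xi + s_Y * np.sin(t) * eta

claimed = mu_X**3 * mu_Y + 3 * mu_X**2 * C + 3 * mu_X * mu_Y * s_X**2 + 3 * s_X**2 * C
print(np.mean(X**3 * Y), claimed)  # agree up to Monte Carlo error
```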

Edit: A similar approach, to compute every $\mathrm E(X^nY^k)$ at the same time, is to consider the function $\varphi$ defined by $ \varphi(u,v)=\mathrm E(\mathrm e^{uX+vY})=\sum\limits_{n\geqslant0}\sum\limits_{k\geqslant0}\frac{u^n}{n!}\frac{v^k}{k!}\mathrm E(X^nY^k). $ Since the random variable $uX+vY$ is normal with mean and variance $ m(u,v)=u\mu_X+v\mu_Y,\qquad s(u,v)^2=u^2\sigma_X^2+2uvC_{X,Y}+v^2\sigma_Y^2, $ one knows that $\varphi(u,v)=\exp(m(u,v)+\frac12s(u,v)^2)$. The task now is to expand this as a power series in $(u,v)$ and to identify the coefficient of $u^nv^k$. This leads to the formula $ \mathrm E(X^nY^k)=n!\,k!\,\sum_\ast\frac{\mu_X^i}{i!}\frac{\mu_Y^j}{j!}\frac{\sigma_X^{2r}}{2^rr!}\frac{\sigma_Y^{2s}}{2^ss!}\frac{C_{X,Y}^t}{t!}, $ where the sum $\sum\limits_\ast$ is over every nonnegative $(i,j,r,s,t)$ such that $ i+2r+t=n,\qquad j+2s+t=k. $ For $(n,k)=(2,1)$, the possible values of $(i,j,r,s,t)$ are $ (2,1,0,0,0),\quad (0,1,1,0,0),\quad (1,0,0,0,1), $ which correspond respectively to the monomials $\mu_X^2\mu_Y,\quad \mu_Y\sigma_X^2,\quad \mu_XC_{X,Y}. $
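This sum can be implemented and cross-checked against mixed derivatives of $\varphi$ at the origin, since $\mathrm E(X^nY^k)=\partial_u^n\partial_v^k\varphi(0,0)$; a `sympy` sketch (note the factor $\sigma_Y^{2s}/(2^ss!)$, paired with the variance of $Y$):

```python
import sympy as sp

# Sketch: the sum formula for E(X^n Y^k), cross-checked against mixed
# derivatives of phi(u, v) = exp(m(u,v) + s(u,v)^2 / 2) at (0, 0).
mu_X, mu_Y, sX, sY, C = sp.symbols('mu_X mu_Y sigma_X sigma_Y C')
u, v = sp.symbols('u v')

def moment_sum(n, k):
    # Sum over nonnegative (i, j, r, s, t) with i+2r+t = n and j+2s+t = k.
    total = 0
    for r in range(n // 2 + 1):
        for s in range(k // 2 + 1):
            for t in range(min(n - 2 * r, k - 2 * s) + 1):
                i, j = n - 2 * r - t, k - 2 * s - t
                total += (mu_X**i / sp.factorial(i)) * (mu_Y**j / sp.factorial(j)) \
                    * (sX**(2 * r) / (2**r * sp.factorial(r))) \
                    * (sY**(2 * s) / (2**s * sp.factorial(s))) \
                    * (C**t / sp.factorial(t))
    return sp.expand(sp.factorial(n) * sp.factorial(k) * total)

phi = sp.exp(u * mu_X + v * mu_Y
             + sp.Rational(1, 2) * (u**2 * sX**2 + 2 * u * v * C + v**2 * sY**2))

def moment_mgf(n, k):
    # E(X^n Y^k) as the (n, k) mixed derivative of phi at the origin.
    d = phi
    for _ in range(n):
        d = sp.diff(d, u)
    for _ in range(k):
        d = sp.diff(d, v)
    return sp.expand(d.subs({u: 0, v: 0}))

for n in range(4):
    for k in range(3):
        assert sp.simplify(moment_sum(n, k) - moment_mgf(n, k)) == 0

print(moment_sum(2, 1))  # matches the boxed formula for E(X^2 Y)
```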

  • When the text of the exercise says so. – 2012-05-25