88

Say I have $X \sim \mathcal N(a, b)$ and $Y\sim \mathcal N(c, d)$. Is $XY$ also normally distributed?

Is the answer any different if we know that $X$ and $Y$ are independent?

  • 25
    For a negative answer to your first question, take X=Y; then XY=X^2 cannot be Gaussian, since it only takes nonnegative values. (2012-01-21)
  • 5
    No: http://mathworld.wolfram.com/NormalProductDistribution.html (2013-06-17)
  • 4
    How does this question relate to this paper: http://www.tina-vision.net/docs/memos/2003-003.pdf, which says that "the product and the convolution of Gaussian probability density functions (PDFs) are also Gaussian functions"? (2014-12-28)
  • 4
    @AsadIqbal They are related by a confusion between the product of independent random variables and the product of their PDFs. A random variable that is the product of two independent Gaussian random variables is not Gaussian, except in degenerate cases such as one of the factors being constant. The product of two Gaussian PDFs, on the other hand, is always (trivially) proportional to a Gaussian PDF, and the same holds for the convolution of two Gaussian PDFs. (A short numeric sketch of this distinction appears just below these comments.) (2015-06-23)
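
A minimal numeric sketch of the distinction drawn in the last comment, assuming NumPy and SciPy are available; the means, variances, grid, and sample size are arbitrary illustration choices, not taken from the question:

```python
# Sketch only: the *pointwise product* of two Gaussian PDFs has a log that is an
# exact quadratic in x, i.e. it is proportional to a Gaussian PDF, whereas samples
# of the *random-variable product* X*Y are far from Gaussian (kurtosis ~ 9, not 3).
# All parameter values here are arbitrary illustration choices.
import numpy as np
from scipy.stats import norm

x = np.linspace(-5, 5, 1001)
prod_pdf = norm.pdf(x, loc=1.0, scale=0.8) * norm.pdf(x, loc=-0.5, scale=1.5)

# Fit a quadratic to log(prod_pdf); the residual is essentially floating-point noise.
coeffs = np.polyfit(x, np.log(prod_pdf), deg=2)
residual = np.log(prod_pdf) - np.polyval(coeffs, x)
print("max deviation of log-product from a quadratic:", np.max(np.abs(residual)))

# By contrast, the product of two independent N(0,1) samples has kurtosis ~ 9.
rng = np.random.default_rng(0)
z = rng.standard_normal(10**6) * rng.standard_normal(10**6)
print("kurtosis of X*Y:", np.mean(z**4) / np.mean(z**2)**2)  # a Gaussian gives 3
```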

5 Answers

4

As pointed out by Davide Giraudo, the characteristic function of $Z$ is $\varphi_Z(t) = \frac{1}{\sqrt{1+t^2}}$. The corresponding probability density is therefore given by
$$\begin{aligned} f_Z(z) &= \frac{1}{2\pi}\int_{-\infty}^{+\infty} e^{-itz}\,\varphi_Z(t)\,dt = \frac{1}{2\pi}\int_{-\infty}^{+\infty}\frac{\cos zt - i\sin zt}{\sqrt{1+t^2}}\,dt \\ &= \frac{1}{2\pi}\int_{-\infty}^{+\infty}\frac{\cos zt}{\sqrt{1+t^2}}\,dt - \frac{i}{2\pi}\int_{-\infty}^{+\infty}\frac{\sin zt}{\sqrt{1+t^2}}\,dt. \end{aligned}$$
The last of these integrals vanishes because the integrand is an odd function. Therefore $\Im[f_Z(z)]=0$, and since the remaining integrand is even we have
$$f_Z(z)=\frac{1}{\pi}\int_0^{+\infty}\frac{\cos zt}{\sqrt{1+t^2}}\,dt.$$
For $z > 0$ this integral converges and equals the zeroth-order modified Bessel function of the second kind, usually denoted $K_0(z)$. Thus
$$f_Z(z)=\frac{1}{\pi}K_0(|z|), \qquad z \neq 0.$$
This shows that the random variable $Z$ has a Bessel-type (K-form) distribution. For more information see, for instance, Johnson and Kotz, Distributions in Statistics: Continuous Univariate Distributions, Boston, Houghton-Mifflin, 1970. See also https://en.wikipedia.org/wiki/K-distribution.
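
A quick Monte Carlo cross-check of this density, not part of the original answer and assuming NumPy/SciPy; the sample size and bin grid are arbitrary choices:

```python
# Sketch only: compare the empirical density of the product of two independent
# standard normals with (1/pi) * K_0(|z|). Sample size and bins are arbitrary.
import numpy as np
from scipy.special import k0

rng = np.random.default_rng(1)
n = 10**6
z = rng.standard_normal(n) * rng.standard_normal(n)

edges = np.linspace(0.25, 3.0, 12)
counts, _ = np.histogram(z, bins=edges)
emp_density = counts / (n * np.diff(edges))   # density estimate relative to all n samples
centers = 0.5 * (edges[:-1] + edges[1:])

# Columns: bin center, empirical density, (1/pi) K_0(|z|); they should agree closely.
print(np.c_[centers, emp_density, k0(np.abs(centers)) / np.pi])
```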

95

The product of two Gaussian random variables is distributed, in general, as a linear combination of two Chi-square random variables:

$$ XY \,=\, \frac{1}{4} (X+Y)^2 - \frac{1}{4}(X-Y)^2$$

Now, $X+Y$ and $X-Y$ are Gaussian random variables, so that $(X+Y)^2$ and $(X-Y)^2$ are scaled Chi-square random variables with 1 degree of freedom.

If $X$ and $Y$ are both zero-mean, then

$$ XY \sim c_1 Q - c_2 R$$

where $c_1=\frac{Var(X+Y)}{4}$, $c_2 = \frac{Var(X-Y)}{4}$ and $Q, R \sim \chi^2_1$ are central.

The variables $Q$ and $R$ are independent if and only if $Var(X) = Var(Y)$.

In general, $Q$ and $R$ are noncentral and dependent.
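
A small simulation sketch of this decomposition, not part of the original answer and assuming NumPy; the zero means, the variance pairs, and the sample size are arbitrary choices. It checks the identity pointwise and that the sample covariance of $X+Y$ and $X-Y$ is close to $Var(X)-Var(Y)$, which vanishes exactly when the variances match:

```python
# Sketch only: verify XY = (X+Y)^2/4 - (X-Y)^2/4 on samples, and that
# Cov(X+Y, X-Y) = Var(X) - Var(Y), so the two quadratic pieces are uncorrelated
# (hence, being jointly Gaussian, independent) exactly when Var(X) = Var(Y).
import numpy as np

rng = np.random.default_rng(2)
n = 10**6
for var_x, var_y in [(1.0, 1.0), (1.0, 4.0)]:   # arbitrary variance pairs
    x = rng.normal(0.0, np.sqrt(var_x), n)
    y = rng.normal(0.0, np.sqrt(var_y), n)

    # The algebraic identity holds pointwise (up to floating-point error).
    assert np.allclose(x * y, 0.25 * (x + y) ** 2 - 0.25 * (x - y) ** 2)

    # Sample covariance of X+Y and X-Y is close to Var(X) - Var(Y).
    print(var_x, var_y, np.cov(x + y, x - y)[0, 1])
```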

  • 0
    I am also working on the distribution of the inner product of two random variables having a normal distribution, and the different topics on this subject in this forum have helped me a lot. Could you give some references/proofs for your last sentence, that the variables Q and R are independent if and only if Var(X) = Var(Y)? I faced exactly this problem in my simulations (in signal-processing applications) and could not understand empirically why my code only worked when Var(X) = Var(Y). It makes sense now with your last post. Thanks. (2013-11-29)
  • 0
    Thanks, but I am stuck on finding the expectation of XY when both are Gaussian-distributed RVs. (2014-03-27)
  • 0
    Thanks... what about Var(XY)? (2014-11-19)
  • 2
    Random variables Q and R are independent if and only if (X+Y) and (X-Y) are. Now, these are linear functions of the vector (X,Y). It is a theorem in multivariate probability that two linear functions AU and BU of a Gaussian vector U are independent if and only if A.var(U).B^T = 0. This implies that (X+Y) and (X-Y) are independent if and only if Var(X) = Var(Y), as can be easily verified. (2014-11-22)
  • 0
    The independence of the differences is neat. Do you also have independence of $XY$ and $X^2$? (2016-02-23)
  • 0
    @UlissesBraga-Neto May I ask for a reference in multivariate theory where I can find this theorem? (2017-07-16)
39

As @Yemon Choi pointed out for the first question, the answer is negative without any further hypothesis: taking $Y=X$ gives $XY=X^2$, and $P(X^2<0)=0$ whereas $P(U<0)\neq 0$ for any nondegenerate Gaussian $U$.

For the second question the answer is also no. Take $X$ and $Y$ to be two independent Gaussian random variables, each with mean $0$ and variance $1$. Since they are jointly Gaussian with the same variance, $X-Y$ and $X+Y$ are independent Gaussian random variables. Put $Z:=\frac{X^2-Y^2}2=\frac{X-Y}{\sqrt 2}\cdot\frac{X+Y}{\sqrt 2}$. Then $Z$ is the product of two independent Gaussians, but the characteristic function of $Z$ is $\varphi_Z(t)=\frac 1{\sqrt{1+t^2}}$, which is not the characteristic function of a Gaussian.
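
A short numeric illustration of this counterexample, not from the original answer and assuming NumPy; the sample size and the $t$ grid are arbitrary. $Z$ has variance $1$ but fourth moment about $9$ (a Gaussian with variance $1$ has fourth moment $3$), and its empirical characteristic function tracks $1/\sqrt{1+t^2}$:

```python
# Sketch only: Z = (X^2 - Y^2)/2 for independent standard normals X, Y.
import numpy as np

rng = np.random.default_rng(3)
n = 10**6
x = rng.standard_normal(n)
y = rng.standard_normal(n)
z = 0.5 * (x**2 - y**2)

print("Var(Z)  ~", np.var(z))       # ~1
print("E[Z^4]  ~", np.mean(z**4))   # ~9; a Gaussian with variance 1 would give 3

# Z is symmetric, so its characteristic function is real: E[cos(tZ)] ~ 1/sqrt(1+t^2).
for t in (0.5, 1.0, 2.0):           # arbitrary t values
    print(t, np.mean(np.cos(t * z)), 1.0 / np.sqrt(1.0 + t**2))
```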

  • 5
    This proves that there are cases where X and Y are independent and XY isn't Gaussian, and cases where X and Y are non-independent and XY isn't Gaussian. However, there is also the question of whether cases exist in which XY is Gaussian (which would mean that X and Y would have to be non-independent). It wouldn't surprise me if such cases did exist, especially if the joint density of X and Y is allowed to be a generalized function. (One also needs to rule out the pathological case where a Dirac delta function could be considered to be a Gaussian with zero variance.) (2012-01-21)
10

You can use moments to see that the product $XY$ of independent normals cannot be normal except in trivial cases. By trivial, I mean $\mathbb{V}(X)\mathbb{V}(Y)=0.$

Suppose that $X,Y$ are independent normals so that $XY$ is normal.

Case 1: Suppose that $\mathbb{E}(X)=0$. By independence, $\mathbb{E}(XY)=\mathbb{E}(X)\mathbb{E}(Y)=0$, so $XY$ is a mean-zero normal, and hence $$\mathbb{E}((XY)^4)=3\,\mathbb{E}((XY)^2)^2.$$ By independence this becomes
$$\mathbb{E}(X^4)\,\mathbb{E}(Y^4)=3\,\mathbb{E}(X^2)^2\,\mathbb{E}(Y^2)^2.$$ Since $X$ is a mean-zero normal, $\mathbb{E}(X^4)=3\,\mathbb{E}(X^2)^2$. Either $\mathbb{V}(X)=0$, or dividing both sides by $\mathbb{E}(X^4)=3\,\mathbb{E}(X^2)^2$ gives $\mathbb{E}(Y^4)=\mathbb{E}(Y^2)^2$, i.e. $\mathbb{V}(Y^2)=0$. This shows that $Y^2$, and hence $Y$, has zero variance.

Case 2: Suppose that $\mathbb{E}(X^2)>0$ and $\mathbb{E}(Y^2)>0$. Then, rescaling, we may assume without loss of generality that $\mathbb{E}(X^2)=1$ and $\mathbb{E}(Y^2)=1$; by independence we also have $\mathbb{E}((XY)^2)=1$. For any normal $W$ one has $\mathbb{E}(W^3)=\mathbb{E}(W)^3+3\,\mathbb{E}(W)\,\mathbb{V}(W)=\mathbb{E}(W)\bigl(3\,\mathbb{E}(W^2)-2\,\mathbb{E}(W)^2\bigr)$, so here $$\begin{eqnarray} \mathbb{E}(X^3)&=&\mathbb{E}(X)(3-2\mathbb{E}(X)^2)\\ \mathbb{E}(Y^3)&=&\mathbb{E}(Y)(3-2\mathbb{E}(Y)^2)\\ \mathbb{E}((XY)^3)&=&\mathbb{E}(XY)(3-2\mathbb{E}(XY)^2). \end{eqnarray} $$ By independence, $\mathbb{E}((XY)^3)=\mathbb{E}(X^3)\,\mathbb{E}(Y^3)$, and subtracting the product of the first two lines from the third line gives $$0=6\mathbb{E}(X)\,\mathbb{E}(Y)\,\mathbb{V}(X)\,\mathbb{V}(Y).$$ Either we are in a trivial case, or $\mathbb{E}(X)=0$ or $\mathbb{E}(Y)=0$ and we are back to Case 1.

Thus, the product cannot be normal except in trivial cases.
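
A hedged numeric companion to Case 1, not from the original answer; the helper function and the parameter triples below are illustrative only. Using the closed-form normal moments $\mathbb{E}(X^2)=\mu^2+\sigma^2$ and $\mathbb{E}(X^4)=\mu^4+6\mu^2\sigma^2+3\sigma^4$, it compares $\mathbb{E}((XY)^4)$ with $3\,\mathbb{E}((XY)^2)^2$ for mean-zero $X$; equality holds only in the degenerate case.

```python
# Sketch only: for independent normals with E(X) = 0, a normal XY would force
# E[(XY)^4] = 3 * E[(XY)^2]^2.  The formulas below are the standard raw moments
# of N(mu, var); the parameter triples are arbitrary illustration choices.
def normal_moments(mu, var):
    m2 = mu**2 + var
    m4 = mu**4 + 6 * mu**2 * var + 3 * var**2
    return m2, m4

cases = [((0.0, 1.0), (0.0, 1.0)),   # both standard normal
         ((0.0, 1.0), (2.0, 0.0)),   # Y degenerate (zero variance): equality holds
         ((0.0, 2.0), (1.0, 3.0))]   # nondegenerate: equality fails

for (mu_x, var_x), (mu_y, var_y) in cases:
    m2x, m4x = normal_moments(mu_x, var_x)
    m2y, m4y = normal_moments(mu_y, var_y)
    lhs = m4x * m4y                # E[(XY)^4] by independence
    rhs = 3 * (m2x * m2y) ** 2     # what normality of XY would require (E(XY) = 0)
    print((mu_x, var_x), (mu_y, var_y), lhs, rhs)
```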

5

For a general answer to your question please refer to the Wishart distribution.

  • 0
    I don't think so. The Wishart is the product distribution of Gamma distributions. However, the Normal distribution is not Gamma (the Chi-squared is), and the type of product of Gamma distributions taken there differs from the one meant in the question. (2018-12-10)