
What is the distribution of a random variable that is the product of two normal random variables?

Let $X\sim N(\mu_1,\sigma_1)$, $Y\sim N(\mu_2,\sigma_2)$, and $Z=XY$.

That is, what is its probability density function, its expected value, and its variance?

I'm kind of stuck and I can't find a satisfying answer on the web. If anybody knows the answer, or a reference or link, I would be really thankful...

  • @Clara, I attempted to edit your post to make it more clear. Please let me know if anything was incorrect. (2012-06-22)

5 Answers


I will assume $X$ and $Y$ are independent. By scaling, we may assume for simplicity that $\sigma_1 = \sigma_2 = 1$. You might then note that $XY = (X+Y)^2/4 - (X-Y)^2/4$, where $X+Y$ and $X-Y$ are independent normal random variables; $(X+Y)^2/2$ and $(X-Y)^2/2$ have noncentral chi-squared distributions with $1$ degree of freedom. If $f_1$ and $f_2$ are the densities of those, then since $XY = \big((X+Y)^2/2 - (X-Y)^2/2\big)/2$, the PDF of $XY$ is $ f_{XY}(z) = 2 \int_0^\infty f_2(t)\, f_1(2z+t)\ dt$
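This decomposition is easy to sanity-check by simulation. Below is a minimal sketch assuming NumPy and SciPy are available; the means `mu1`, `mu2` are illustrative, and `scipy.stats.ncx2` supplies the noncentral chi-squared samples:

```python
import numpy as np
from scipy import stats

# Monte Carlo check of XY = (X+Y)^2/4 - (X-Y)^2/4 with sigma1 = sigma2 = 1.
rng = np.random.default_rng(0)
mu1, mu2, n = 1.0, -0.5, 500_000

x = rng.normal(mu1, 1.0, n)
y = rng.normal(mu2, 1.0, n)
z_direct = x * y

# (X+Y)^2/2 and (X-Y)^2/2 are independent noncentral chi-squared (df = 1),
# since X+Y and X-Y are independent; (X+Y)/sqrt(2) ~ N((mu1+mu2)/sqrt(2), 1).
u = stats.ncx2.rvs(df=1, nc=(mu1 + mu2) ** 2 / 2, size=n, random_state=rng)
v = stats.ncx2.rvs(df=1, nc=(mu1 - mu2) ** 2 / 2, size=n, random_state=rng)
z_decomp = (u - v) / 2

print(z_direct.mean(), z_decomp.mean())  # both near mu1*mu2 = -0.5
print(z_direct.var(), z_decomp.var())    # both near mu1^2 + mu2^2 + 1 = 2.25
```

Both samples should agree in distribution, so their means and variances match up to Monte Carlo error.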

  • Because otherwise there is no answer: you need more information. (2017-02-26)

For the special case that both Gaussian random variables $X$ and $Y$ have zero mean and unit variance, and are independent, the answer is that $Z=XY$ has the probability density $p_Z(z)={\rm K}_0(|z|)/\pi$. The brute force way to do this is via the transformation theorem: \begin{align} p_Z(z)&=\frac{1}{2\pi}\int_{-\infty}^\infty{\rm d}x\int_{-\infty}^\infty{\rm d}y\;{\rm e}^{-(x^2+y^2)/2}\delta(z-xy) \\ &= \frac{1}{\pi}\int_0^\infty\frac{{\rm d}x}{x}{\rm e}^{-(x^2+z^2/x^2)/2}\\ &= \frac{1}{\pi}{\rm K}_0(|z|) \ . \end{align}
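The middle integral can be checked against the closed form numerically. A sketch assuming SciPy (`scipy.integrate.quad` for the integral, `scipy.special.k0` for the Bessel function):

```python
import numpy as np
from scipy import integrate, special

# Evaluate (1/pi) * \int_0^\infty exp(-(x^2 + z^2/x^2)/2) dx/x numerically
# and compare with the closed form K_0(|z|)/pi.
def pdf_via_integral(z):
    val, _ = integrate.quad(
        lambda x: np.exp(-(x * x + z * z / (x * x)) / 2.0) / x, 0.0, np.inf
    )
    return val / np.pi

for z in (0.5, 1.0, 3.0):
    print(z, pdf_via_integral(z), special.k0(abs(z)) / np.pi)
```

The integrand vanishes rapidly as $x\to 0$ for $z\neq 0$, so the quadrature is well behaved away from $z=0$ (the density itself diverges logarithmically at $z=0$).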


Given the densities $\varphi$ and $\psi$ of two independent random variables, the probability that their product is less than $z$ is $ \iint_{xy< z}\varphi(x)\psi(y)\,\mathrm{d}x\,\mathrm{d}y\tag{1} $ Letting $w=xy$ so that $x=w/y$ (and noting that for $y<0$ the orientation of the inner integral flips, producing $|y|$) yields $ \iint_{w< z}\varphi\left(\frac{w}{y}\right)\psi(y)\,\mathrm{d}w\,\frac{\mathrm{d}y}{|y|}\tag{2} $ Taking the derivative of $(2)$ with respect to $z$ gives the density of the product of the random variables to be $ \phi(z)=\int\varphi\left(\frac{z}{y}\right)\psi(y)\,\frac{\mathrm{d}y}{|y|}\tag{3} $ We can compute the expected value using this distribution as $ \begin{align} \mathrm{E}(Z) &=\int z\,\phi(z)\,\mathrm{d}z\\ &=\iint z\,\varphi\left(\frac{z}{y}\right)\psi(y)\,\frac{\mathrm{d}y}{|y|}\,\mathrm{d}z\\ &=\iint xy\,\varphi(x)\psi(y)\,\mathrm{d}y\,\mathrm{d}x\tag{4} \end{align} $ which is exactly what one would expect when computing the expected value of the product directly.

In the same way, we can also compute $ \begin{align} \mathrm{E}(Z^2) &=\int z^2\phi(z)\,\mathrm{d}z\\ &=\iint z^2\,\varphi\left(\frac{z}{y}\right)\psi(y)\,\frac{\mathrm{d}y}{|y|}\,\mathrm{d}z\\ &=\iint x^2y^2\,\varphi(x)\psi(y)\,\mathrm{d}y\,\mathrm{d}x\tag{5} \end{align} $ again getting the same result as when computing this directly.

The variance is then, as usual, $\mathrm{E}(Z^2)-\mathrm{E}(Z)^2$.
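For independent normals these moments come out in closed form: $\mathrm{E}(Z)=\mu_1\mu_2$ and $\mathrm{E}(Z^2)=(\mu_1^2+\sigma_1^2)(\mu_2^2+\sigma_2^2)$. A sketch assuming NumPy for a Monte Carlo cross-check (the parameter values are illustrative):

```python
import numpy as np

# Closed-form mean and variance of Z = XY for independent
# X ~ N(mu1, s1^2), Y ~ N(mu2, s2^2).
def product_mean_var(mu1, s1, mu2, s2):
    mean = mu1 * mu2                               # E(Z) = E(X) E(Y)
    ez2 = (mu1**2 + s1**2) * (mu2**2 + s2**2)      # E(Z^2) = E(X^2) E(Y^2)
    return mean, ez2 - mean**2                     # Var(Z) = E(Z^2) - E(Z)^2

mu1, s1, mu2, s2 = 2.0, 0.5, -1.0, 1.5
mean, var = product_mean_var(mu1, s1, mu2, s2)
print(mean, var)   # -2.0 9.8125

rng = np.random.default_rng(1)
z = rng.normal(mu1, s1, 400_000) * rng.normal(mu2, s2, 400_000)
print(z.mean(), z.var())   # close to the closed-form values
```

Expanding $\mathrm{E}(Z^2)-\mathrm{E}(Z)^2$ gives the familiar formula $\mu_1^2\sigma_2^2+\mu_2^2\sigma_1^2+\sigma_1^2\sigma_2^2$.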

  • @Michael Chernick: The lowest price for a used copy at Amazon is more than \$400, and a new copy costs double that! (2017-04-19)

To your first point, let's look at how we calculate the expectation of a product. The idea is easy enough, but Gaussian distributions can look a little messier than they really are.

Specifically, to look at the distribution function, I would start here: http://en.wikipedia.org/wiki/Product_distribution

For expectation though, recall (or note):

For independent random variables, the joint probability density function, $h(x,y)$, can be found simply as the product of the marginal densities, say $f(x)$ and $g(y)$.

That is, $h(x,y)=f(x)\,g(y)$. You find the expectation in the same way you would find it for a single variable with a single pdf. Namely,

$E(XY)=E(Z)=\iint xy\,h(x,y)\,\mathrm{d}y\,\mathrm{d}x=\iint xy\,f(x)g(y)\,\mathrm{d}y\,\mathrm{d}x=\left[\int x f(x)\,\mathrm{d}x\right]\left[\int y g(y)\,\mathrm{d}y\right]=E(X)E(Y)$

For standard normal RVs, this is simple to compute. If, in fact, your variables are not independent, then you need to incorporate a covariance term into your calculations.
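The covariance correction mentioned here can be sketched numerically. Assuming NumPy, for jointly normal $X,Y$ with correlation $\rho$ one has $E(XY)=\mu_1\mu_2+\rho\sigma_1\sigma_2$ (the parameter values below are illustrative):

```python
import numpy as np

# Monte Carlo check of E(XY) = mu1*mu2 + rho*s1*s2 for correlated normals.
rng = np.random.default_rng(2)
mu1, mu2 = 1.0, 2.0
s1, s2, rho = 1.0, 0.5, 0.6
cov = [[s1**2, rho * s1 * s2], [rho * s1 * s2, s2**2]]

xy = rng.multivariate_normal([mu1, mu2], cov, size=500_000)
m = (xy[:, 0] * xy[:, 1]).mean()
print(m)   # close to mu1*mu2 + rho*s1*s2 = 2.3
```

Setting $\rho=0$ recovers the independent case $E(XY)=E(X)E(Y)$ above.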

Hope that helps.

  • ... and more generally, $E[X^m] = \sum_{k=0}^{\lfloor m/2 \rfloor} \dfrac{\mu_1^{m-2k} \sigma_1^{2k}\, m!}{2^k\, k!\, (m-2k)!}$, and similarly for $E[Y^m]$, so you can get all the moments. (2012-06-22)
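The moment formula in this comment is straightforward to implement; a sketch in plain Python (the function name is mine), for $X\sim N(\mu,\sigma^2)$:

```python
from math import factorial

# E[X^m] = sum_{k=0}^{floor(m/2)} mu^(m-2k) sigma^(2k) m! / (2^k k! (m-2k)!)
def normal_moment(m, mu, sigma):
    return sum(
        mu ** (m - 2 * k) * sigma ** (2 * k) * factorial(m)
        / (2**k * factorial(k) * factorial(m - 2 * k))
        for k in range(m // 2 + 1)
    )

print(normal_moment(2, 1.0, 2.0))  # mu^2 + sigma^2 = 5.0
print(normal_moment(4, 0.0, 1.0))  # 4th moment of a standard normal: 3.0
```

Since $E[(XY)^m]=E[X^m]\,E[Y^m]$ for independent $X,Y$, this gives all the moments of the product as well.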

It is called The Algebra of Random Variables by Melvin D. Springer (Wiley, 1979) and includes a lot on products: http://www.amazon.com/Algebra-Variables-Probability-Mathematical-Statistics/dp/0471014060/ref=sr_1_1?s=books&ie=UTF8&qid=1340403029&sr=1-1&keywords=the+algebra+of+random+variables

In searching I also found this book by Galambos and Simonelli: http://www.amazon.com/s/ref=nb_sb_noss?url=search-alias%3Dstripbooks&field-keywords=product+of+random+variables