
I've got a uniform random variable $X\sim\mathcal{U}(-a,a)$ and a normal random variable $Y\sim\mathcal{N}(0,\sigma^2)$. I am interested in their sum $Z=X+Y$. Using the convolution integral, one can derive the p.d.f. for $Z$:

$f_Z(x)=\frac{1}{2a\sigma\sqrt{2\pi}}\int_{x-a}^{x+a}e^{-\frac{u^2}{2\sigma^2}}du=\frac{1}{2a}\left[\Phi\left(\frac{x+a}{\sigma}\right)-\Phi\left(\frac{x-a}{\sigma}\right)\right]$ where $\Phi(\cdot)$ is the standard normal c.d.f.

I am trying to evaluate $\int_{-\infty}^{\infty} f_Z^2(x)dx$ and $\int_{-\infty}^{\infty} f_Z^3(x)dx$. Are there bounds on these expressions in terms of elementary functions? Can they be expressed in terms of a finite sum involving $\Phi(\cdot)$?
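
If a numerical reference value is useful, both integrals can be approximated by quadrature directly from the $\Phi$ form of $f_Z$; a short sketch assuming SciPy is available (the function names are just illustrative):

```python
# Numerical sanity check of the two integrals, assuming SciPy.
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

def f_Z(x, a, sigma):
    """Density of Z = X + Y with X ~ U(-a, a), Y ~ N(0, sigma^2)."""
    return (norm.cdf((x + a) / sigma) - norm.cdf((x - a) / sigma)) / (2 * a)

def integral_of_power(p, a, sigma):
    """Approximate the integral of f_Z(x)^p over the real line."""
    val, _ = quad(lambda x: f_Z(x, a, sigma) ** p, -np.inf, np.inf)
    return val

a, sigma = 1.0, 1.0
print(integral_of_power(2, a, sigma))  # integral of f_Z^2
print(integral_of_power(3, a, sigma))  # integral of f_Z^3
```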

  • Note that I use the natural log throughout (including in my definition of differential entropy) and not $\log_2$. But you can adjust that easily, if needed. :) (2011-11-21)

2 Answers


It looks to me like $ \int_{-\infty}^\infty f_Z(x)^2\,dx = \frac{1}{2a}\operatorname{erf}\!\left(\frac{a}{\sigma}\right) + \frac{\sigma}{2a^2\sqrt{\pi}}\,e^{-a^2/\sigma^2} - \frac{\sigma}{2a^2\sqrt{\pi}} $
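
As a quick sanity check, this closed form can be compared against direct quadrature of $\int f_Z^2$; a sketch assuming SciPy, with illustrative helper names:

```python
# Compare the closed form for the integral of f_Z^2 with direct quadrature, assuming SciPy.
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm
from scipy.special import erf

def f_Z(x, a, sigma):
    return (norm.cdf((x + a) / sigma) - norm.cdf((x - a) / sigma)) / (2 * a)

def closed_form(a, sigma):
    return (erf(a / sigma) / (2 * a)
            + sigma * np.exp(-a**2 / sigma**2) / (2 * a**2 * np.sqrt(np.pi))
            - sigma / (2 * a**2 * np.sqrt(np.pi)))

for a, sigma in [(0.5, 1.0), (2.0, 1.0), (1.0, 3.0)]:
    numeric, _ = quad(lambda x: f_Z(x, a, sigma) ** 2, -np.inf, np.inf)
    print(a, sigma, numeric, closed_form(a, sigma))  # the last two columns should agree
```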

EDIT: OK, here's the proof.

For convenience, scale distances so that $\sigma = 1$. We're looking at

$J =\frac{1}{8 \pi a^2} \int_{-\infty}^\infty dx \int_{x-a}^{x+a} ds \int_{x-a}^{x+a} dt\ e^{-(s^2+t^2)/2} $

Interchange the order of integration so this becomes

$\eqalign{ J &= \frac{1}{8 \pi a^2} \int_{-\infty}^\infty ds \int_{s-2a}^{s+2a} dt \int_{\max(s,t)-a}^{\min(s,t)+a} dx\ e^{-(s^2+t^2)/2}\cr &= \frac{1}{8 \pi a^2} \int_{-\infty}^\infty ds \int_{s-2a}^{s+2a} dt\ (2a + \min(s,t) -\max(s,t)) e^{-(s^2+t^2)/2} \cr}$

Break this up into two pieces, one where $s<t$ and the other where $s>t$.

$ \eqalign{J_1 &= \frac{1}{8 \pi a^2} \int_{-\infty}^\infty ds \int_s^{s+2a} dt\ (2a + s-t) e^{-(s^2+t^2)/2}\cr J_2 &= \frac{1}{8 \pi a^2} \int_{-\infty}^\infty ds \int_{s-2a}^{s} dt\ (2a + t-s) e^{-(s^2+t^2)/2}\cr}$

In $J_1$, take $t = s+u$ (so that $-(s^2+t^2)/2 = -s^2 - us - u^2/2$); in $J_2$, take $t = s - u$ (so that $-(s^2+t^2)/2 = -s^2 + us - u^2/2$). With these changes of variables we can recombine the integrals:

$ J = \frac{1}{8 \pi a^2} \int_{-\infty}^\infty ds \int_0^{2a} du\ (2a - u) e^{-s^2} (e^{-us} + e^{us}) e^{-u^2/2} $

Interchange the order of integration again

$ \eqalign{J &= \frac{1}{8 \pi a^2} \int_0^{2a} du \int_{-\infty}^\infty ds\ (2a-u) e^{-s^2} (e^{-us}+e^{us}) e^{-u^2/2} \cr &= \frac{1}{2\sqrt{\pi} a^2} \int_0^{2a} du\ (a-u/2) e^{-u^2/4}\cr &= \frac{1}{2a} \text{erf}(a) + \frac{1}{2 \sqrt{\pi} a^2} e^{-a^2} - \frac{1}{2 \sqrt{\pi} a^2}\cr}$
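
Restoring general $\sigma$ (replace $a$ by $a/\sigma$ and divide the result by $\sigma$) recovers the expression at the top. The two integration steps used above, the Gaussian integral $\int_{-\infty}^{\infty} e^{-s^2 \pm us}\,ds = \sqrt{\pi}\,e^{u^2/4}$ and the final $u$-integral, can also be checked symbolically; a sketch assuming SymPy:

```python
# Symbolic verification of the two integration steps (sigma = 1), assuming SymPy.
import sympy as sp

s, u, a = sp.symbols('s u a', positive=True)

# Step 1: the inner s-integral equals 2*sqrt(pi)*exp(u**2/4).
inner = sp.integrate(sp.exp(-s**2) * (sp.exp(-u*s) + sp.exp(u*s)), (s, -sp.oo, sp.oo))
print(sp.simplify(inner - 2*sp.sqrt(sp.pi)*sp.exp(u**2/4)))  # expected: 0

# Step 2: the final u-integral reproduces the closed form.
J = sp.integrate((a - u/2) * sp.exp(-u**2/4), (u, 0, 2*a)) / (2*sp.sqrt(sp.pi)*a**2)
closed = sp.erf(a)/(2*a) + sp.exp(-a**2)/(2*sp.sqrt(sp.pi)*a**2) - 1/(2*sp.sqrt(sp.pi)*a**2)
print(sp.simplify(J - closed))  # expected: 0
```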

  • Very nice proof! (2012-04-26)

I don't have a complete answer, just a suggestion for part of your question.

$f_Z$ is the convolution of a uniform density and a Gaussian density. Thus its Fourier transform (characteristic function) $\Psi_Z$ is the product of a Gaussian function and a sinc function. Parseval's theorem then gives $ \int (f_Z)^2 = \int |\Psi_Z|^2 $, up to a factor of $2\pi$ or similar depending on how the Fourier transform is defined. The integrand on the right-hand side is the product of another Gaussian and a $\operatorname{sinc}^2$ function, and that integral may already have a known closed form, or at least be more amenable to numerical integration than something that requires multiple evaluations of $\Phi(x)$.
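
For instance, with the characteristic-function convention $\Psi_Z(\omega)=\mathbb{E}[e^{i\omega Z}]$ the factor is $\frac{1}{2\pi}$, so the right-hand side is $\frac{1}{2\pi}\int \left(\frac{\sin a\omega}{a\omega}\right)^2 e^{-\sigma^2\omega^2}\,d\omega$, which is easy to evaluate numerically; a sketch assuming SciPy:

```python
# Evaluate the Parseval form (1/2π) ∫ |Ψ_Z(ω)|² dω numerically, assuming SciPy.
import numpy as np
from scipy.integrate import quad

def abs_cf_squared(w, a, sigma):
    """|Ψ_Z(ω)|² = (sin(aω)/(aω))² · exp(-σ²ω²)."""
    sinc = np.sinc(a * w / np.pi)           # np.sinc(x) = sin(pi x)/(pi x)
    return sinc**2 * np.exp(-(sigma * w)**2)

a, sigma = 1.0, 1.0
parseval, _ = quad(lambda w: abs_cf_squared(w, a, sigma), -np.inf, np.inf)
print(parseval / (2 * np.pi))   # should match the quadrature value of ∫ f_Z² dx
```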