
I've got a uniform random variable $X\sim\mathcal{U}(-a,a)$ and a normal random variable $Y\sim\mathcal{N}(0,\sigma^2)$. I am interested in their sum $Z=X+Y$. Using the convolution integral, one can derive the p.d.f. for $Z$:

$$f_Z(x)=\frac{1}{2a\sigma\sqrt{2\pi}}\int_{x-a}^{x+a}e^{-\frac{u^2}{2\sigma^2}}du=\frac{1}{2a}\left[\Phi\left(\frac{x+a}{\sigma}\right)-\Phi\left(\frac{x-a}{\sigma}\right)\right]$$ where $\Phi(\cdot)$ is the standard normal c.d.f.

I am trying to evaluate $\int_{-\infty}^{\infty} f_Z^2(x)dx$ and $\int_{-\infty}^{\infty} f_Z^3(x)dx$. Are there bounds on these expressions in terms of elementary functions? Can they be expressed in terms of a finite sum involving $\Phi(\cdot)$?
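As a quick numerical sanity check (a sketch, not part of the question itself): the density and the two integrals can be evaluated by quadrature, assuming SciPy is available; the values `a = 1` and `sigma = 0.5` below are arbitrary sample parameters.

```python
import math

from scipy.integrate import quad
from scipy.stats import norm

a, sigma = 1.0, 0.5  # arbitrary sample parameters

def f_Z(x):
    # p.d.f. of Z = X + Y: (1/(2a)) * [Phi((x+a)/sigma) - Phi((x-a)/sigma)]
    return (norm.cdf((x + a) / sigma) - norm.cdf((x - a) / sigma)) / (2 * a)

# f_Z is numerically negligible well outside [-a, a], so finite limits suffice
lo, hi = -a - 10 * sigma, a + 10 * sigma
I2, _ = quad(lambda x: f_Z(x) ** 2, lo, hi)
I3, _ = quad(lambda x: f_Z(x) ** 3, lo, hi)
print(I2, I3)
```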

  • Upper bound? Lower bound? What are you looking for? Note that by the mean-value theorem, you have that $\Phi((x+a)/\sigma) - \Phi((x-a)/\sigma) = \varphi(\xi)(2 a /\sigma)$ for some $\xi \in [(x-a)/\sigma,(x+a)/\sigma]$, where $\varphi = \Phi'$. Now $\varphi'$ doesn't change sign too many times, so you can use one endpoint or the other (in the appropriate places) to get a bound. (2011-11-20)
  • I'm interested in both upper and lower bounds. Good idea on the MVT; I'll try that, though I think I tried it earlier on a similar problem and didn't get a tight enough bound. Thanks for the help! (2011-11-20)
  • You can get lower bounds by using Rényi entropy as follows. Let $h(f) = - \mathbb E( \log f(X) )$ be the differential entropy. Then $\int_{-\infty}^\infty f^{\alpha+1}(x) \,\mathrm{d}x = \mathbb E f^\alpha(X)$. By Jensen's inequality, $\log( \mathbb E f^\alpha(X) ) \geq \alpha \mathbb E \log f(X) = - \alpha h(f)$, so $\mathbb E f^{\alpha}(X) \geq \exp(-\alpha h(f))$. I gave you some tight bounds on the Shannon entropy for this problem in a comment to another question of yours. (2011-11-21)
  • That's very neat (and works in other cases where you have a tight bound on the entropy of an otherwise painful distribution)! Thanks! (2011-11-21)
  • Note that I use the natural log throughout (including in my definition of differential entropy) and not $\log_2$. But you can adjust that easily, if needed. :) (2011-11-21)
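The entropy lower bound from the comments can be illustrated numerically; this is a sketch assuming SciPy, with arbitrary sample values `a = 1`, `sigma = 0.5`, and `alpha = 1` (so the left-hand side is $\int f_Z^2$).

```python
import math

from scipy.integrate import quad
from scipy.stats import norm

a, sigma, alpha = 1.0, 0.5, 1.0  # arbitrary sample values; alpha = 1 gives int f^2

def f_Z(x):
    return (norm.cdf((x + a) / sigma) - norm.cdf((x - a) / sigma)) / (2 * a)

# restrict to a range where f_Z is numerically strictly positive
lo_x, hi_x = -a - 7 * sigma, a + 7 * sigma

# differential entropy h(f) = -E[log f(Z)], natural log as in the comments
h, _ = quad(lambda x: -f_Z(x) * math.log(f_Z(x)), lo_x, hi_x)

lhs, _ = quad(lambda x: f_Z(x) ** (alpha + 1), lo_x, hi_x)
bound = math.exp(-alpha * h)
print(lhs, bound)  # lhs should dominate the bound
```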

2 Answers


It looks to me like $$ \int_{-\infty}^\infty f_Z(x)^2\ dx = -{\frac {{\sigma}}{2{a}^{2}\sqrt {\pi }}}+\frac{1}{2a} {{\rm erf}\left({\frac {a}{{\sigma}}}\right)}+ \frac{\sigma}{2a^2 \sqrt{\pi}} {{\rm e}^{-a^2/\sigma^2}} $$

EDIT: OK, here's the proof.

For convenience, scale distances so that $\sigma = 1$. We're looking at

$$J =\frac{1}{8 \pi a^2} \int_{-\infty}^\infty dx \int_{x-a}^{x+a} ds \int_{x-a}^{x+a} dt\ e^{-(s^2+t^2)/2} $$

Interchange the order of integration so this becomes

$$\eqalign{ J &= \frac{1}{8 \pi a^2} \int_{-\infty}^\infty ds \int_{s-2a}^{s+2a} dt \int_{\max(s,t)-a}^{\min(s,t)+a} dx\ e^{-(s^2+t^2)/2}\cr &= \frac{1}{8 \pi a^2} \int_{-\infty}^\infty ds \int_{s-2a}^{s+2a} dt\ (2a + \min(s,t) -\max(s,t)) e^{-(s^2+t^2)/2} \cr}$$

Break this up into two pieces, one where $s<t$ and the other where $s>t$.

$$ \eqalign{J_1 &= \frac{1}{8 \pi a^2} \int_{-\infty}^\infty ds \int_s^{s+2a} dt\ (2a + s-t) e^{-(s^2+t^2)/2}\cr J_2 &= \frac{1}{8 \pi a^2} \int_{-\infty}^\infty ds \int_{s-2a}^{s} dt\ (2a + t-s) e^{-(s^2+t^2)/2}\cr}$$

In $J_1$, take $t = s+u$ (so that $-(s^2+t^2)/2 = -s^2 - us - u^2/2$); in $J_2$, take $t = s - u$ (so that $-(s^2+t^2)/2 = -s^2 + us - u^2/2$). With these changes of variables we recombine the integrals:

$$ J = \frac{1}{8 \pi a^2} \int_{-\infty}^\infty ds \int_0^{2a} du\ (2a - u) e^{-s^2} (e^{-us} + e^{us}) e^{-u^2/2} $$

Interchange the order of integration again

$$ \eqalign{J &= \frac{1}{8 \pi a^2} \int_0^{2a} du \int_{-\infty}^\infty ds\ (2a-u) e^{-s^2} (e^{-us}+e^{us}) e^{-u^2/2} \cr &= \frac{1}{2\sqrt{\pi} a^2} \int_0^{2a} du\ (a-u/2) e^{-u^2/4}\cr &= \frac{1}{2a} \text{erf}(a) + \frac{1}{2 \sqrt{\pi} a^2} e^{-a^2} - \frac{1}{2 \sqrt{\pi} a^2}\cr}$$
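As a sanity check on the final expression (with $\sigma$ restored by the scaling used at the start of the proof), it can be compared against direct quadrature of $f_Z^2$; a sketch assuming SciPy, with arbitrarily chosen parameter pairs:

```python
import math

from scipy.integrate import quad
from scipy.stats import norm

def closed_form(a, sigma):
    # the claimed value of int f_Z^2 dx, sigma restored
    return (-sigma / (2 * a**2 * math.sqrt(math.pi))
            + math.erf(a / sigma) / (2 * a)
            + sigma * math.exp(-(a / sigma) ** 2) / (2 * a**2 * math.sqrt(math.pi)))

def by_quadrature(a, sigma):
    f = lambda x: ((norm.cdf((x + a) / sigma) - norm.cdf((x - a) / sigma)) / (2 * a)) ** 2
    val, _ = quad(f, -a - 10 * sigma, a + 10 * sigma)  # tails are negligible
    return val

for a, sigma in [(1.0, 1.0), (2.0, 0.5), (0.3, 1.5)]:
    print(a, sigma, closed_form(a, sigma), by_quadrature(a, sigma))
```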

  • Any chance you could post how you arrived at that? Thanks! (2012-04-24)
  • Actually I got it from the Maclaurin series: it seems that the series is $$\sum_{k=0}^\infty \frac{(-1)^k}{(4k+2)(k+1)!\sqrt{\pi}} \frac{a^{2k}}{\sigma^{1+2k}}$$ But it should be possible to prove this more directly. (2012-04-24)
  • Very nice proof! (2012-04-26)
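The series in the comment can be checked against the closed form numerically; a sketch using only the standard library, at the arbitrarily chosen point $a = \sigma = 1$:

```python
import math

a, sigma = 1.0, 1.0  # arbitrary sample point

# partial sum of the Maclaurin series from the comment (terms decay factorially)
series = sum(
    (-1) ** k / ((4 * k + 2) * math.factorial(k + 1) * math.sqrt(math.pi))
    * a ** (2 * k) / sigma ** (1 + 2 * k)
    for k in range(40)
)

# closed form from the answer above
closed = (-sigma / (2 * a**2 * math.sqrt(math.pi))
          + math.erf(a / sigma) / (2 * a)
          + sigma * math.exp(-(a / sigma) ** 2) / (2 * a**2 * math.sqrt(math.pi)))

print(series, closed)
```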

I don't have a complete answer, just a suggestion for part of your question.

$f_Z$ is the convolution of a uniform density and a Gaussian density. Thus, its Fourier transform or characteristic function $\Psi_Z$ is the product of a Gaussian function and a sinc function. Parseval's theorem then gives us that $$ \int (f_Z)^2 = \int |\Psi_Z|^2 $$ (there may be a $2\pi$ or something similar that needs to be included in that equation depending on how the Fourier transform is defined). The integrand on the right hand side is the product of another Gaussian and a $\text{sinc}^2$ function, and the integral on the right might even have a closed form that is known already, or be more amenable to numerical integration than something that requires multiple computations of values of $\Phi(x)$.
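A numerical sketch of this route, assuming SciPy: with the convention $\Psi_Z(t) = \mathbb E\, e^{itZ}$, Parseval reads $\int f_Z^2 = \frac{1}{2\pi}\int |\Psi_Z|^2$, and $|\Psi_Z(t)|^2 = \operatorname{sinc}^2(at)\, e^{-\sigma^2 t^2}$. The parameter values below are arbitrary.

```python
import math

from scipy.integrate import quad
from scipy.stats import norm

a, sigma = 1.0, 0.5  # arbitrary sample parameters

def psi_sq(t):
    # |Psi_Z(t)|^2: the uniform part gives sin(a t)/(a t),
    # the Gaussian part gives exp(-sigma^2 t^2 / 2); modulus squared below
    s = math.sin(a * t) / (a * t) if t != 0.0 else 1.0
    return s * s * math.exp(-(sigma * t) ** 2)

# the Gaussian factor makes the integrand negligible beyond moderate |t|
freq_side = quad(psi_sq, -60, 60, limit=200)[0] / (2 * math.pi)

f2 = lambda x: ((norm.cdf((x + a) / sigma) - norm.cdf((x - a) / sigma)) / (2 * a)) ** 2
space_side = quad(f2, -a - 10 * sigma, a + 10 * sigma)[0]
print(freq_side, space_side)
```

The two sides should agree, which also confirms the $\frac{1}{2\pi}$ normalization for this choice of transform convention.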