Is there an analytical or approximate solution of the following integral?
$ \int_{-\infty}^{\infty}\int_{y-d}^{y+d}\exp\big(-{(x-\mu_1)^2}/{2\sigma^2}\big) \exp\big(-{(y-\mu_2)^2}/{2\sigma^2}\big) \,\mathrm{d}x\,\mathrm{d}y $
You can solve this using the same approach used in this answer.
EDIT 1:
The iterated integral can be expressed simply as $ I = 2\pi \sigma ^2 \bigg[\Phi \bigg(\frac{{d - (\mu _1 - \mu _2 )}}{{\sqrt {2\sigma ^2 } }}\bigg) - \Phi \bigg(\frac{{ - d - (\mu _1 - \mu _2 )}}{{\sqrt {2\sigma ^2 } }}\bigg)\bigg], $ where $\Phi$ is the distribution function of the standard normal distribution (confirmed numerically). I'll provide the derivation later on.
EDIT 2: Before I give the derivation of this result, here are a few numerical results confirming it. Let $r_1 = r_1(d,\mu_1,\mu_2,\sigma)$ denote the numerical approximation of the iterated integral and $r_2 = r_2(d,\mu_1,\mu_2,\sigma)$ the numerical approximation of the simple expression for $I$ above. The following results were obtained:

$d=1.2,\ \mu_1=2.8,\ \mu_2=1.6,\ \sigma=1.9$: $\quad r_1 = 7.125001170659572,\ r_2 = 7.125000629782468$

$d=1.9,\ \mu_1=2.4,\ \mu_2=-4.6,\ \sigma=1.5$: $\quad r_1 = 0.11438563902950714,\ r_2 = 0.11438586127326182$

$d=3.2,\ \mu_1=-0.4,\ \mu_2=8.1,\ \sigma=2.9$: $\quad r_1 = 5.070682528645151,\ r_2 = 5.070686434762869$

$d=5.2,\ \mu_1=-1.4,\ \mu_2=-3.1,\ \sigma=0.8$: $\quad r_1 = 4.017263196283472,\ r_2 = 4.017262629040601$
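For readers who want to reproduce such a comparison themselves, here is a minimal SciPy sketch (not part of the original computation; the function names are my own). It evaluates the iterated integral by nested numerical quadrature and the closed-form expression via the standard normal CDF:

```python
import numpy as np
from scipy import integrate
from scipy.stats import norm

def iterated_integral(d, mu1, mu2, sigma):
    """r1: direct numerical evaluation of the double integral."""
    inner = lambda y: integrate.quad(
        lambda x: np.exp(-(x - mu1) ** 2 / (2 * sigma ** 2)), y - d, y + d
    )[0]
    outer = lambda y: inner(y) * np.exp(-(y - mu2) ** 2 / (2 * sigma ** 2))
    return integrate.quad(outer, -np.inf, np.inf)[0]

def closed_form(d, mu1, mu2, sigma):
    """r2: the closed form 2*pi*sigma^2 * [Phi(...) - Phi(...)]."""
    s = np.sqrt(2 * sigma ** 2)  # standard deviation of X - Y
    return 2 * np.pi * sigma ** 2 * (
        norm.cdf((d - (mu1 - mu2)) / s) - norm.cdf((-d - (mu1 - mu2)) / s)
    )

for params in [(1.2, 2.8, 1.6, 1.9), (1.9, 2.4, -4.6, 1.5)]:
    print(iterated_integral(*params), closed_form(*params))
```

The two columns of output should agree to several decimal places, matching the $r_1$, $r_2$ pairs above.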
EDIT 3:
Proof: First write the iterated integral as $ I = 2\pi \sigma ^2 \int_{ - \infty }^\infty {\bigg[\int_{y - d}^{y + d} {f_X (x)\,dx} \bigg]f_Y (y)\,dy} , $ where $f_X$ and $f_Y$ are the probability density functions of the ${\rm N}(\mu_1,\sigma^2)$ and ${\rm N}(\mu_2,\sigma^2)$ distributions, respectively. Hence, $ I = 2\pi \sigma ^2 \int_{ - \infty }^\infty {{\rm P}( - d \le X - y \le d)f_Y (y)\,dy} , $ where $X \sim {\rm N}(\mu_1,\sigma^2)$. In turn, by the law of total probability, $ I = 2\pi \sigma ^2 {\rm P}( - d \le X - Y \le d), $ where $Y \sim {\rm N}(\mu_2,\sigma^2)$ and is independent of $X$. Finally, since $X-Y \sim {\rm N}(\mu_1 - \mu_2,2\sigma^2)$, $ I = 2\pi \sigma ^2 {\rm P}\bigg(\frac{{ - d - (\mu _1 - \mu _2 )}}{{\sqrt {2\sigma ^2 } }} \le Z \le \frac{{d - (\mu _1 - \mu _2 )}}{{\sqrt {2\sigma ^2 } }}\bigg), $ where $Z$ is a standard normal random variable. The result is thus established.
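The key probabilistic step, $X - Y \sim {\rm N}(\mu_1 - \mu_2, 2\sigma^2)$, can be checked empirically. Here is a short Monte Carlo sketch of my own (the parameter values are taken from the second numerical example above):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
d, mu1, mu2, sigma = 1.9, 2.4, -4.6, 1.5
n = 10**6

x = rng.normal(mu1, sigma, n)  # X ~ N(mu1, sigma^2)
y = rng.normal(mu2, sigma, n)  # Y ~ N(mu2, sigma^2), independent of X
mc = np.mean(np.abs(x - y) <= d)  # empirical P(-d <= X - Y <= d)

s = np.sqrt(2) * sigma  # standard deviation of X - Y
exact = norm.cdf((d - (mu1 - mu2)) / s) - norm.cdf((-d - (mu1 - mu2)) / s)
print(mc, exact)
```

The empirical frequency should agree with the exact probability to within Monte Carlo error; multiplying either by $2\pi\sigma^2$ recovers $I$.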
In response to OP's comment, I elaborate here on the equality $ {\rm P}( - d \le X - Y \le d) = \int_{ - \infty }^\infty {{\rm P}( - d \le X - y \le d)f_Y (y)\,dy}. $ A particular case of the law of total probability allows us to compute the probability of an event by conditioning on a suitable random variable, as follows: $ {\rm P}(A) = \int_{ - \infty }^\infty {{\rm P}(A|Z = z)f_Z (z)\,dz} , $ where $f_Z$ is the probability density function of $Z$. (See also p. 1 here; the law of total probability is a special case of the law of total expectation.) Now, letting $A$ be the event that $-d \leq X-Y \leq d$ and letting $Z=Y$, the above equation yields $ {\rm P}( - d \le X - Y \le d) = \int_{ - \infty }^\infty {{\rm P}(- d \le X - Y \le d|Y = y)f_Y (y)\,dy}. $ Since $X$ and $Y$ are independent, $ {\rm P}( - d \le X - Y \le d|Y = y) = {\rm P}( - d \le X - y \le d). $ The desired equality is thus established.
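This conditioning identity can also be verified numerically. In the sketch below (my own illustration, with parameter values from the third numerical example), the left-hand side integrates ${\rm P}(-d \le X - y \le d) = \Phi\big(\frac{y+d-\mu_1}{\sigma}\big) - \Phi\big(\frac{y-d-\mu_1}{\sigma}\big)$ against $f_Y$, and the right-hand side is the probability computed from $X - Y \sim {\rm N}(\mu_1-\mu_2, 2\sigma^2)$:

```python
import numpy as np
from scipy import integrate
from scipy.stats import norm

d, mu1, mu2, sigma = 3.2, -0.4, 8.1, 2.9

# P(-d <= X - y <= d) for X ~ N(mu1, sigma^2), as a function of y
prob_given_y = lambda y: norm.cdf(y + d, mu1, sigma) - norm.cdf(y - d, mu1, sigma)

# Left-hand side: integrate the conditional probability against f_Y
lhs = integrate.quad(lambda y: prob_given_y(y) * norm.pdf(y, mu2, sigma),
                     -np.inf, np.inf)[0]

# Right-hand side: P(-d <= X - Y <= d) via X - Y ~ N(mu1 - mu2, 2*sigma^2)
s = np.sqrt(2) * sigma
rhs = norm.cdf((d - (mu1 - mu2)) / s) - norm.cdf((-d - (mu1 - mu2)) / s)
print(lhs, rhs)
```

Both quantities should coincide to within quadrature error.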