
Suppose that $X\sim\mathcal{N}\left(0,1\right)$ (i.e., $X$ is a standard normal random variable) and $a$, $b$, and $c$ are real constants. Do any of the following expectations have a closed form?

  1. $\mathbb{E}\left[\log\Phi\left(aX\right)\right]$
  2. $\mathbb{E}\left[\Phi\left(aX\right)\log\Phi\left(aX\right)\right]$
  3. $\mathbb{E}\left[\Phi\left(aX\right)\Phi\left(bX+c\right)\right]$

I've tried the usual derivative trick followed by Stein's lemma, but the expressions didn't get much simpler. For the third one, if $a=b$ and $c=0$, the closed-form solution is $$\mathbb{E}\left[\left(\Phi\left(aX\right)\right)^2\right]=\frac{\tan^{-1}\left(\sqrt{1+2a^2}\right)}{\pi},$$ but I couldn't reach any generalization.
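For a quick numerical sanity check of this identity, here is a sketch using `scipy` quadrature (the value of $a$ below is arbitrary):

```python
# Check E[Phi(aX)^2] = arctan(sqrt(1 + 2a^2)) / pi for one arbitrary a.
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

a = 1.3  # arbitrary test constant
lhs, _ = quad(lambda x: norm.cdf(a * x) ** 2 * norm.pdf(x), -np.inf, np.inf)
rhs = np.arctan(np.sqrt(1 + 2 * a ** 2)) / np.pi
print(lhs, rhs)  # both approximately 0.358
```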

  • Is $\Phi$ the normal density or the CDF? (2012-07-16)
  • Yes, $\Phi\left(\cdot\right)$ is the normal CDF. (2012-07-16)
  • There is no closed form for the normal CDF, so I don't expect a closed form for 1 or 2, and probably not for 3 either. Had it been the normal density, the closed-form solution to 1 would be very easy to get. (2012-07-16)
  • By closed form I didn't mean an analytical function. It would be acceptable if the answers contain well-known functions, like the normal CDF itself. As long as it is not another expectation or a similar integral, and it is simple enough, I think it should be fine. (2012-07-16)
  • Your third expectation is $P\{Y \leq aX, Z \leq bX+c\}$ where $X, Y, Z$ are independent $N(0,1)$ random variables. The events $\{Y \leq aX\}$ and $\{Z \leq bX+c\}$ are not independent, but they are _conditionally independent_ given $X = x$, with conditional probabilities $\Phi(ax)$ and $\Phi(bx+c)$ respectively. Maybe looking at the problem where #3 arose will reveal a different method of attack. (2012-07-17)
  • Using your notation, that boils down to deriving the CDF of the jointly normal random vector $\left(Y-aX,Z-bX\right)$. I guess it cannot be simplified beyond that, unless there is some decomposition of the bivariate normal CDF in terms of the univariate normal CDF. The problem arose when I wanted to compute $\Pr\lbrace yu < T\rbrace$ for some constant $T$, where $y$ is a $\pm 1$ Bernoulli RV conditioned on a random variable $v$ and $\left(u,v\right)$ is a jointly normal RV. (2012-07-17)
  • Got something from my answer below? (2012-07-25)
  • Your answer for the third expression with $c=0$ was certainly useful, but it seems that the general case with $c\neq 0$ doesn't have an analytical solution, and it contains an integral anyway. (2012-07-26)
  • What is the conclusion of your last comment? (And please use the @ syntax so that comments trigger a notification; I only saw yours by chance, although it is clearly addressed to me.) (2012-08-11)

1 Answer


Cases 1. and 2. when $a=\pm1$ follow from the following remark: for every suitable function $u$,
$$ \mathrm E(u'(\Phi(X)))=\int_{-\infty}^{+\infty} u'(\Phi(x))\varphi(x)\,\mathrm dx=\left[u(\Phi(x))\right]_{x=-\infty}^{x=+\infty}=u(1)-u(0). $$

For example, $u(t)=t\log(t)-t$ yields $u'(t)=\log t$, $u(1)=-1$ and $u(0)=0$, hence
$$ \mathrm E(\log\Phi(aX))=-1,\quad a=\pm1. $$

Likewise, $u(t)=\frac12t^2\log t-\frac14t^2$ yields $u'(t)=t\log t$, $u(1)=-\frac14$ and $u(0)=0$, hence
$$ \mathrm E(\Phi(aX)\log\Phi(aX))=-\tfrac14,\quad a=\pm1. $$

About case 3., $u(t)=\frac13t^3$ yields $u'(t)=t^2$, $u(1)=\frac13$ and $u(0)=0$, hence
$$ \mathrm E(\Phi(aX)^2)=\tfrac13,\quad a=\pm1. $$
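A quick numerical check of these three values, as a sketch only (the helper `expect` below is mine, using `scipy` quadrature):

```python
# Verify E[log Phi(X)] = -1, E[Phi(X) log Phi(X)] = -1/4, E[Phi(X)^2] = 1/3.
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

def expect(f):
    """E[f(X)] for X ~ N(0,1), by numerical integration."""
    val, _ = quad(lambda x: f(x) * norm.pdf(x), -np.inf, np.inf)
    return val

print(expect(norm.logcdf))                             # approximately -1
print(expect(lambda x: norm.cdf(x) * norm.logcdf(x)))  # approximately -1/4
print(expect(lambda x: norm.cdf(x) ** 2))              # approximately 1/3
```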


Another approach is to differentiate with respect to the parameter $a$ and to use Stein's lemma. Consider $v(a)=\mathrm E(u(\Phi(aX)))$ for some suitable function $u$; then $v(0)=u(\frac12)$ and
$$ v'(a)=\mathrm E(X\varphi(aX)u'(\Phi(aX)))=\mathrm E(Xg(X)) $$
where
$$ g(x)=\varphi(ax)u'(\Phi(ax)). $$
Stein's lemma and the identities $\varphi'(s)=-s\varphi(s)$ and $\Phi'(s)=\varphi(s)$ yield
$$ v'(a)=\mathrm E(g'(X))=\mathrm E(-a^2X\varphi(aX)u'(\Phi(aX))+a\varphi(aX)^2u''(\Phi(aX))), $$
hence
$$ v'(a)=-a^2v'(a)+a\mathrm E(\varphi(aX)^2u''(\Phi(aX))), $$
and
$$ v'(a)=\frac{a}{1+a^2}\mathrm E(\varphi(aX)^2u''(\Phi(aX))). $$

If $u(t)=t^2$, then $u''(t)=2$, hence one gets $v(0)=u(\frac12)=\frac14$ and
$$ \mathrm E(\varphi(aX)^2)=\int_{-\infty}^{+\infty}\varphi(ax)^2\varphi(x)\,\mathrm dx=\frac1{2\pi}\int_{-\infty}^{+\infty}\varphi(\sqrt{1+2a^2}\,x)\,\mathrm dx=\frac1{2\pi\sqrt{1+2a^2}}, $$
hence
$$ v'(a)=\frac{a}{\pi (1+a^2)\sqrt{1+2a^2}}. $$
Integrating this, one gets
$$ v(a)=\frac14+\frac1{\pi}\int_0^a\frac{x\,\mathrm dx}{(1+x^2)\sqrt{1+2x^2}}=\frac14+\frac1{\pi}\left[\arctan\sqrt{1+2x^2}\right]_{x=0}^{x=a}, $$
and finally, since $\frac1\pi\arctan1=\frac14$ cancels the constant term,
$$ \mathrm E(\Phi(aX)^2)=\frac1\pi\arctan\sqrt{1+2a^2}. $$

Likewise, for fixed $b$, $v(a)=\mathrm E(\Phi(aX)\Phi(bX))$ yields
$$ v'(a)=\mathrm E(X\varphi(aX)\Phi(bX))=\mathrm E(Xg(X)) $$
where
$$ g(x)=\varphi(ax)\Phi(bx). $$
Stein's lemma and the formula $g'(x)=-a^2x\varphi(ax)\Phi(bx)+b\varphi(ax)\varphi(bx)$ yield
$$ (1+a^2)v'(a)=b\mathrm E(\varphi(aX)\varphi(bX)), $$
hence
$$ v'(a)=\frac{b}{2\pi (1+a^2)\sqrt{1+a^2+b^2}}. $$
Integrating from $a=b$, where $v(b)=\mathrm E(\Phi(bX)^2)$ is given by the previous formula, one gets
$$ \mathrm E(\Phi(aX)\Phi(bX))=\frac1\pi\arctan\sqrt{1+2b^2}+\frac{b}{2\pi}\int_b^a\frac{\mathrm dx}{(1+x^2)\sqrt{1+x^2+b^2}}. $$
Amongst several equivalent formulations, this means that

$$ \mathrm E(\Phi(aX)\Phi(bX))=\frac14+\frac1{2\pi}\arctan\left(\frac{ab}{\sqrt{1+a^2+b^2}}\right). $$
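A numerical check of this closed form, including its symmetry in $a$ and $b$ (an illustrative sketch with arbitrary test constants):

```python
# Compare E[Phi(aX) Phi(bX)] by quadrature with the arctan formula,
# and check that the formula is symmetric in a and b.
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

def expectation(a, b):
    val, _ = quad(lambda x: norm.cdf(a * x) * norm.cdf(b * x) * norm.pdf(x),
                  -np.inf, np.inf)
    return val

def closed_form(a, b):
    return 0.25 + np.arctan(a * b / np.sqrt(1 + a ** 2 + b ** 2)) / (2 * np.pi)

a, b = 0.8, -2.1  # arbitrary test constants
print(expectation(a, b), closed_form(a, b), closed_form(b, a))  # all three agree
```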

The case $\mathrm E(\Phi(aX)\Phi(bX+c))$ might also be solvable with this method.
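No closed form is claimed here for $c\neq0$, but for numerical purposes the general case is a single one-dimensional quadrature (a sketch; the constants and the helper name are mine):

```python
# Evaluate E[Phi(aX) Phi(bX + c)] by quadrature; with c = 0 it reproduces
# the arctan formula above (a, b, c below are arbitrary test values).
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

def expectation_abc(a, b, c):
    val, _ = quad(lambda x: norm.cdf(a * x) * norm.cdf(b * x + c) * norm.pdf(x),
                  -np.inf, np.inf)
    return val

print(expectation_abc(0.8, -2.1, 0.0))  # c = 0: matches the closed form above
print(expectation_abc(0.8, -2.1, 0.5))  # general c: quadrature only
```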

  • The case with nonzero $c$ gets an extra exponential term in the integral, which makes it somewhat uglier. By the way, as a sanity check, the formula you derived for $c=0$ should be symmetric in $a$ and $b$. Is it really symmetric? (2012-07-17)
  • Good point. Some numerical computations could help to check that the final formula is indeed symmetric. Did you try? (2012-07-17)
  • I think $\mathbb{E}\left[\Phi\left(aX\right)\Phi\left(bX\right)\right]=\frac{1}{4} + \frac{1}{2 \pi} \tan^{-1}\frac{ab}{\sqrt{1+a^2+b^2}}$ is the symmetric answer. (2012-07-17)
  • Nicer formula. I think you are right... hence I included it in my post. (2012-07-17)