
I have a fairly basic question that relates to understanding a particular derivation.

I have the following function $Q(x) = E\left[I(F(x+\varepsilon)>c)\right]$, where $x \in \mathbb{R}$, $\varepsilon \sim N(0,\sigma^2)$ for a known $\sigma>0$, $c$ is a scalar constant, $F(\cdot)$ is a known, bounded function, and $I(\cdot)$ is an indicator function.

This can be written as:

$E[I(F(x+\varepsilon)>c)] = \frac{1}{{\sqrt{2\pi}}\sigma} \int{I(F(x+\varepsilon)>c) \exp\left(-\frac{\varepsilon^2}{2\sigma^2} \right)}d\varepsilon $

Let $y = x + \varepsilon$ and rewrite the above as:

$E[I(F(x+\varepsilon)>c)] = \frac{1}{{\sqrt{2\pi}}\sigma} \int{I(F(y)>c) \exp\left(-\frac{(y-x)^2}{2\sigma^2} \right)}dy$

I am interested in calculating $\frac{\partial Q(x)}{\partial x}$. The derivative is:

$\frac{\partial E[I(F(x+\varepsilon)>c)]}{\partial x}= \frac{1}{{\sqrt{2\pi}}\sigma} \int{I(F(y)>c) \exp\left(-\frac{(y-x)^2}{2\sigma^2} \right)} \frac{(y-x)}{\sigma^2}dy \quad \quad $ (1)

Substituting back in again and rearranging slightly:

$\frac{\partial E[I(F(x+\varepsilon)>c)]}{\partial x} = \frac{1}{\sigma^2} E[I(F(x+\varepsilon)>c)\varepsilon]$

Finally, here are my two questions: (1) Is the above derivation correct? (2) If so, why in eq. (1) do I not need to take the derivative of $I(F(y)>c)$ with respect to $x$? I know there has been a change of variables, but $y$ is obviously a function of $x$. Yet I was told that "once you do the substitution $y=x+\varepsilon$, $y$ is just a dummy variable (i.e., an index) which runs from $-\infty$ to $\infty$, and one cannot take the derivative with respect to a dummy variable." I know one cannot differentiate an indicator function, but somehow I can't get this clear in my head.

Thanks.

More broadly, in my actual problem, $x$ and $\varepsilon$ are high dimensional and I am trying to compute the derivative via Monte Carlo integration....
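For the Monte Carlo setting, the final identity suggests a direct estimator: draw $\varepsilon_i \sim N(0,\sigma^2)$ and average $\sigma^{-2}\,I(F(x+\varepsilon_i)>c)\,\varepsilon_i$. Here is a minimal sketch in one dimension, assuming illustrative placeholder choices $F = \sin$, $c = 0.2$, $\sigma = 0.5$, and $x = 1$ (none of these are from the original post); it compares the estimator against a central finite difference of $Q(x)$ computed with common random numbers:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical choices for illustration only:
F = np.sin      # a known, bounded function
c = 0.2         # threshold
sigma = 0.5
x = 1.0
N = 2_000_000   # number of Monte Carlo draws

eps = rng.normal(0.0, sigma, size=N)
ind = (F(x + eps) > c).astype(float)

# Monte Carlo estimate of dQ/dx = sigma^{-2} E[ I(F(x+eps)>c) * eps ]
grad_mc = np.mean(ind * eps) / sigma**2

# Finite-difference check on Q(x) = E[ I(F(x+eps)>c) ],
# reusing the same draws (common random numbers) to reduce variance
h = 1e-3
Qp = np.mean(F(x + h + eps) > c)
Qm = np.mean(F(x - h + eps) > c)
grad_fd = (Qp - Qm) / (2 * h)

print(grad_mc, grad_fd)
```

The two estimates should agree up to Monte Carlo error; note that the identity-based estimator never differentiates the indicator, which is exactly the point of question (2).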

1 Answer


This is true and might be recognized as the derivative of the function $x\mapsto\mathrm P(x+\varepsilon\in A)$, where $A=\{F\gt c\}$. Thus, one is considering $U(x)=\mathrm E(u(x+\varepsilon))$, for $u=\mathbf 1_A$ and the assertion is that $U'(x)=\sigma^{-2}\mathrm E(\varepsilon u(x+\varepsilon))$. This holds in full generality and the proof is as you explain.

Re your (2), the derivative of $\mathbf 1_{F(y)\gt c}$ with respect to $x$ exists and it is zero, simply because the function does not depend on $x$. Recall that, under suitable hypotheses, $ \frac{\mathrm d}{\mathrm dx}\int f(x,y)\,\mathrm dy=\int \frac{\partial f(x,y)}{\partial x}\,\mathrm dy. $

  • There is no $y(x,\varepsilon)$ here. Nowhere is $y$ a random variable; $y$ is a variable of integration and could be replaced by any symbol you like. (2012-07-23)