4

Here is my story: I have the following function
$$ g(x)=(1+x)\cdot\exp\left(-\frac{(\log(x+a)+c)^2}{2\sigma^2}\right)1[x\ge y]=f(x)\cdot1[x\ge y] $$
with $a,c,\sigma$ being "good" reals so that $g$ stays well defined, and let $(X_t)$ be a geometric Brownian motion.
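For concreteness, here is a minimal numerical sketch of $f$ and $g$; the parameter values $a$, $c$, $\sigma$, $y$ below are illustrative assumptions, not taken from the question. It makes the jump of size $f(y)$ at $x=y$ visible.

```python
import numpy as np

# Illustrative parameters -- a, c, sigma, y are assumptions for this sketch
a, c, sigma, y = 2.0, 0.5, 0.3, 1.0

def f(x):
    """Smooth part: f(x) = (1 + x) * exp(-(log(x + a) + c)^2 / (2 sigma^2))."""
    return (1.0 + x) * np.exp(-(np.log(x + a) + c) ** 2 / (2.0 * sigma ** 2))

def g(x):
    """g(x) = f(x) * 1[x >= y]: discontinuous at x = y whenever f(y) != 0."""
    return f(x) * (x >= y)
```

The discontinuity at $y$ (a jump of size $f(y)$) is exactly what puts $g$ outside the difference-of-convex class.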

Here $g$ is neither convex nor a difference of two convex functions, because the indicator function breaks continuity at $x=y$.

An "illegal" move then is to apply blindly Itô-Tanaka formula to $g(X_T)$ and get : $$ g(X_T)=g(X_0) +\int_0^T D^-g(X_t)dX_t+ \int_{\mathbb{R}}\Lambda_T(a)\mu(da) $$

where $D^-$ is the left-derivative operator and $\mu$ is the "second derivative measure" (see, for example, Theorem 7.1, page 218, in Karatzas and Shreve's book "Brownian Motion and Stochastic Calculus").
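For reference, the prototypical instance of that theorem is Tanaka's formula: for $u(x)=|x-y|$ one has $D^-u(x)=\operatorname{sgn}(x-y)$ (left-continuous version) and second-derivative measure $\mu=2\delta_y$, so the generalized Itô formula reduces to

```latex
% Tanaka's formula as the prototypical case of the generalized Ito formula:
% u(x) = |x - y|, D^- u = sgn(x - y), mu = 2 * delta_y, hence
\[
  |X_T - y| \;=\; |X_0 - y|
  \;+\; \int_0^T \operatorname{sgn}(X_t - y)\,dX_t
  \;+\; 2\,\Lambda_T(y).
\]
```

which fixes the sign and normalization conventions used below.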

Following this formula blindly, I would get (in the weak sense):
$$ D^-g(x)=f'(x)\cdot1[x>y]+f(y)\cdot\delta_y(x) $$

Now, computing $\mu$ formally seems to go like:
$$ \mu(dx)=f''(x)\cdot1[x\ge y]dx+f'(y)\cdot\delta_y(dx) +f(y)(\delta_y)'(dx) $$

So we arrive at the (probably wrong but appealing) formula: $$ \begin{align} g(X_T)&=g(X_0)+\int_0^T \left(f'(X_t)\cdot 1[X_t\ge y]+f(y)\delta_y(X_t)\right) dX_t+\int_y^{\infty}\Lambda_T(x)f''(x)dx \\ &+f'(y)\Lambda_T(y)-f(y)\partial_y\Lambda_T(y) \end{align} $$

Many of these terms seem ill defined; this was only a heuristic (and terribly bad) calculation, carried out just to see where it would lead. In any case, I am now wondering what the correct result is.

By "correct" I mean a formula that makes the compensator of the process $g(X_t)$ explicit in terms of local time: in the end I would like to take the expectation of $g(X_T)$, get rid of the local-martingale part, and express $\mathsf{E}[g(X_T)]$ as $g(X_0)$ plus the expectation of a compensator written in terms of the local time of $X$.
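Since $X_T$ has an explicit lognormal law, a plain Monte Carlo estimate of $\mathsf{E}[g(X_T)]$ gives a baseline that any local-time expression must reproduce. A sketch, assuming a driftless GBM $dX_t = s\,X_t\,dW_t$ and illustrative parameter values (all numbers below are assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters -- all values are assumptions for this sketch
a, c, sig, y = 2.0, 0.5, 0.3, 1.0   # parameters of f and the barrier y
X0, s, T = 1.0, 0.2, 1.0            # driftless GBM: dX = s * X dW

def f(x):
    return (1.0 + x) * np.exp(-(np.log(x + a) + c) ** 2 / (2.0 * sig ** 2))

def g(x):
    return f(x) * (x >= y)

# X_T can be sampled exactly: X_T = X0 * exp(-s^2 T / 2 + s sqrt(T) Z)
Z = rng.standard_normal(1_000_000)
XT = X0 * np.exp(-0.5 * s ** 2 * T + s * np.sqrt(T) * Z)

mc_estimate = g(XT).mean()  # plain Monte Carlo baseline for E[g(X_T)]
```

Any candidate identity "$\mathsf{E}[g(X_T)] = g(X_0) + \mathsf{E}[\text{compensator in local time}]$" can be checked against this estimate.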

Best regards

PS: the function $f$ was chosen because it seemed simple enough to be Itô-differentiable, yet with reasonable properties so that the expectation should exist.

  • 0
    I fixed the formulas; hope that's OK with you.2012-03-12
  • 0
    @Ilya: No problem, it seems fine to me; thanks for the fixes. Best regards2012-03-12
  • 0
    Maybe you could define $h(x)=g(x)+f(y)1_{\{x<y\}}$. Then $h$ should satisfy the conditions of Problem 6.24 in Karatzas and Shreve, and so you could do what you had in mind for $h$. Then $E[g(X_T)]=E[h(X_T)]-f(y)P(X_T < y)$.2012-03-12
  • 0
    @TheBridge: correct me if I am wrong; from the OP I understood that you are interested in $\mathsf E g(X_t)$. Since $X_t$ is a Markov process whose semigroup is, I think, known, why don't you use this semigroup to find the expectation à la Black–Scholes formula?2012-03-12
  • 0
    @Ilya: Hi, as a matter of fact I am looking for an alternative way to think about this, other than the classical edp from the Markov formulation or an "à la Black–Scholes" formula. This is why I want to express things from a local-time point of view. Best regards.2012-03-12
  • 0
    @TheBridge: edp = pde?2012-03-13
  • 0
    @Ilya: Sorry, I used the French acronym; it is indeed pde. Best regards2012-03-13
  • 1
    @TheBridge: I see - the different use of adjectives leads to a nice duality :)2012-03-20

1 Answer

1

Let $h=g + f(y)1_{(-\infty,y)}$. Then $h$ satisfies the conditions of Problem 6.24 in Karatzas and Shreve, and so can be written as a difference of two convex functions. We therefore have \[ h(X_T) = h(X_0) + \int_0^T D^-h(X_t)\,dX_t + \int_{\mathbb{R}} \Lambda_T(x)\mu(dx). \] In this case, $D^-h = 1_{(y,\infty)}f'$ and \[ \mu(dx) = (1_{(y,\infty)}f'')(x)\,dx + f'(y)\delta_y(dx). \] Hence, \[ h(X_T) = h(X_0) + \int_0^T f'(X_t)1_{\{X_t>y\}}\,dX_t + \int_y^\infty \Lambda_T(x)f''(x)\,dx + f'(y)\Lambda_T(y). \] Since $h=g + f(y)1_{(-\infty,y)}$, this gives \begin{multline*} g(X_T) = g(X_0) + f(y) 1_{\{X_0 < y\}} - f(y)1_{\{X_T < y\}}\\ + \int_0^T f'(X_t)1_{\{X_t>y\}}\,dX_t + \int_y^\infty \Lambda_T(x)f''(x)\,dx + f'(y)\Lambda_T(y). \end{multline*}
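The reduction above rests on the pathwise identity $g(x)=h(x)-f(y)1_{\{x<y\}}$, which gives $\mathsf{E}[g(X_T)]=\mathsf{E}[h(X_T)]-f(y)\,\mathsf{P}(X_T<y)$. A quick numerical sanity check of that identity and of the continuity of $h$ at $y$ (parameter values are illustrative assumptions, not from the post):

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative parameters -- assumptions for this sketch
a, c, sig, y = 2.0, 0.5, 0.3, 1.0
X0, s, T = 1.0, 0.2, 1.0            # driftless GBM: dX = s * X dW

def f(x):
    return (1.0 + x) * np.exp(-(np.log(x + a) + c) ** 2 / (2.0 * sig ** 2))

def g(x):
    return f(x) * (x >= y)

def h(x):
    # h = g + f(y) * 1_{(-inf, y)}: continuous at y, difference of convex functions
    return g(x) + f(y) * (x < y)

# Sample X_T exactly from its lognormal law
Z = rng.standard_normal(500_000)
XT = X0 * np.exp(-0.5 * s ** 2 * T + s * np.sqrt(T) * Z)

# E[g(X_T)] computed directly vs. via the h-decomposition
lhs = g(XT).mean()
rhs = h(XT).mean() - f(y) * (XT < y).mean()
```

The two estimates agree exactly (up to float rounding), since the relation holds path by path, not just in expectation.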

  • 0
    @user11867: Very nice idea, thanks.2012-03-12