Please forgive me for asking this question, even if it seems silly to you. Could you point me to a document or a URL that concisely explains the relationship between Brownian motion and the Laplacian? With Google I found a lot of links, but I cannot find the connection to Markov chains, the heat kernel, and the half Laplacian. Friendly regards.
relationship between Brownian motion and $1 / 2 \Delta$
1 Answer
Brownian motion is a Markov process, since the increment $B_{t+s}-B_t$ is independent of $B_t$ (indeed, of the whole past) and $B_{t+s}-B_t\sim\mathcal N(0,s)$. I do not know how familiar you are with the theory of Markov processes, but here is a nice connection which does not require that theory, only Itô's lemma. Consider $f\in C^2(\mathbb R)$, say with bounded derivatives; then $$ f(B_T) = f(B_0) + \int\limits_0^Tdf(B_t) = f(0) + \int\limits_0^Tf'(B_t)\,dB_t + \int\limits_0^T\frac12f''(B_t)\,dt, $$ so, taking expectations of both sides (the stochastic integral is a martingale and has zero mean), $$ \mathsf E f(B_T) = f(0) + \int\limits_0^T\frac12\, \mathsf Ef''(B_t)\,dt. $$ Now, if we take instead the process $X_t= x+B_t$, then $$ \mathsf E_x f(X_T) = f(x) + \int\limits_0^T\frac12\, \mathsf E_xf''(X_t)\,dt, $$ where $\mathsf E_x$ denotes expectation under the condition that $X_0 = x$. If we set $m(t,x) = \mathsf E_xf(X_t)$, then differentiating in $T$ gives $$ m_t = \frac12m_{xx} = \frac12\Delta m\quad (1) $$ which is the heat equation, with initial condition $m(0,x) = f(x)$.
For the multidimensional case you just take $X_t = (x^1,\dots,x^n)+(B^1_t,\dots,B^n_t)$, where the $B^i_t$ are mutually independent Brownian motions. The operator $\frac12\Delta$ is called the infinitesimal generator of the process.
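As a sanity check, the identity $\mathsf E_x f(X_T) = f(x) + \int_0^T \frac12\,\mathsf E_x f''(X_t)\,dt$ can be verified numerically. Here is a minimal Monte Carlo sketch (my own function names, standard library only), using $f(y)=y^4$ and $x=0$, for which both sides equal $\mathsf E[B_T^4] = 3T^2$:

```python
import math
import random

random.seed(0)

def mc_expectation(g, x, t, n):
    """Monte Carlo estimate of E_x[g(X_t)] for X_t = x + B_t, B_t ~ N(0, t)."""
    s = math.sqrt(t)
    return sum(g(x + random.gauss(0, s)) for _ in range(n)) / n

def rhs(f, f2, x, T, steps=100, n=10_000):
    """f(x) + \u222b_0^T (1/2) E_x[f''(X_t)] dt, via a midpoint Riemann sum."""
    dt = T / steps
    total = f(x)
    for k in range(steps):
        t = (k + 0.5) * dt
        total += 0.5 * mc_expectation(f2, x, t, n) * dt
    return total

f  = lambda y: y**4
f2 = lambda y: 12 * y**2   # second derivative of y^4

T = 1.0
lhs = mc_expectation(f, 0.0, T, 200_000)   # exact value: E[B_1^4] = 3
r   = rhs(f, f2, 0.0, T)                   # exact value: 3 as well
print(lhs, r)
```

Both estimates should agree with the exact value $3T^2 = 3$ up to Monte Carlo error.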
Edited: let us introduce the transition operator $P_t[f(x)] = \mathsf E_x[f(X_t)]$. Then the infinitesimal generator of the process $X$ is simply $$ \mathcal Af(x) = \lim\limits_{t\to0}\frac{P_t[f(x)] - f(x)}{t}; $$ for $X_t = x+B_t$ this limit can be computed from Itô's lemma, and in that case, of course, $\mathcal A = \frac12\Delta$. Moreover, $P_t$ and $\mathcal A$ commute for all $t\geq 0$, which justifies (1).
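The generator limit can also be checked numerically. A minimal sketch (my own names, standard library only), using $f=\cos$ at $x=0$: here $P_t[\cos(x)] = e^{-t/2}\cos(x)$, so the difference quotient should approach $\frac12 f''(0) = -\frac12$ as $t\to 0$:

```python
import math
import random

random.seed(1)

def generator_estimate(f, x, t, n=400_000):
    """Monte Carlo estimate of (P_t[f(x)] - f(x)) / t, with P_t[f(x)] = E[f(x + B_t)]."""
    s = math.sqrt(t)
    pt = sum(f(x + random.gauss(0, s)) for _ in range(n)) / n
    return (pt - f(x)) / t

x, t = 0.0, 0.01
est = generator_estimate(math.cos, x, t)
target = -0.5 * math.cos(x)   # (1/2) f''(x) for f = cos
print(est, target)
```

For small $t$ the estimate is close to $-\frac12$, up to the $O(t)$ bias of the difference quotient and Monte Carlo noise.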
-
@user11995: Excuse me, could you please clarify which passage you mean? – 2011-07-06
-
Thank you for this fast response. Nevertheless, I have a few doubts about the passage from $$ E_x[f(X_T)]= \int_0^T \frac{1}{2}E_x[f''(X_t)]\, dt $$ to $m_t= \frac{1}{2}m_{xx}$. Obviously $E_x[f(X_T)]$ is a primitive (in $T$) of $\frac{1}{2}E_x[f''(X_T)]$, but I do not see how to proceed in order to deduce the equation. Sincerely. – 2011-07-06
-
@user11995: I've edited. For more details on why these operators commute, you can consult M. Davis, "Markov Models and Optimization", section 14. If you would like, I can type a proof here tomorrow. You will certainly find it in any book on Markov processes. – 2011-07-06
-
@user Good question. There is no mystery here: you know that $m_t(t,x)=\frac12E_x(f''(X_t))$, but $E_x(f''(X_t))$ is simply $E(f''(x+B_t))=(E(f(x+B_t)))''=m_{xx}(t,x)$, hence you are done. The only step which needs some care is that one can exchange the second derivative and the expectation, and this holds at least for every bounded function $f$. – 2011-07-06
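This exchange of derivative and expectation can itself be checked numerically. A minimal sketch (my own names, standard library only) with $f=\sin$, for which $m(t,x)=e^{-t/2}\sin(x)$ is explicit: a finite-difference estimate of $m_{xx}(t,x)$ is compared against a direct estimate of $E_x[f''(X_t)]$, using common random numbers for both:

```python
import math
import random

random.seed(2)

def m(x, t, zs):
    """Monte Carlo estimate of m(t, x) = E_x[f(X_t)] for f = sin, X_t = x + B_t."""
    s = math.sqrt(t)
    return sum(math.sin(x + s * z) for z in zs) / len(zs)

n, t, x, h = 400_000, 1.0, 0.7, 1e-2
zs = [random.gauss(0, 1) for _ in range(n)]   # common random numbers

# second derivative in x by central differences
m_xx = (m(x + h, t, zs) - 2 * m(x, t, zs) + m(x - h, t, zs)) / h**2

# direct estimate of E_x[f''(X_t)] = E[-sin(x + B_t)], with the same samples
ef2 = sum(-math.sin(x + math.sqrt(t) * z) for z in zs) / n

print(m_xx, ef2)   # both close to -sin(0.7) * exp(-1/2)
```

The two estimates agree almost exactly (they share the same samples), illustrating $m_{xx}(t,x) = E_x[f''(X_t)]$.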
-
Good evening Didier Piau, thank you for your contribution. I am confused between differentiation w.r.t. $t$ and w.r.t. $x$: when you write $f''(x+B_t)$, is the derivative w.r.t. $t$ or $x$? – 2011-07-06
-
@user11995, consider it as taking the second derivative of $f(y)$ with respect to $y$ and then substituting $y = x+B_t$. Informally, this means the derivative w.r.t. $x$ for any fixed $t$. – 2011-07-07
-
@user When one writes $f''(x+B_t)$ one means the value of $f''$ at the point $x+B_t$, so your dilemma about *derivative with respect to $t$ or with respect to $x$* is moot. (But please begin your comments with this @ thing.) – 2011-07-08