
Let $W_t$ be a standard one-dimensional Brownian motion with $W_0=0$, and let $X_t=\int_0^t{W_s\,ds}$. With the help of Itô's formula, we can get $$E[(X_t)^2]=\frac{1}{3}t^3, \qquad E[(X_t)^3]=0.$$

When I try to employ the same method to calculate the general case $E[(X_t)^n]$, I get stuck. I guess $X_t$ should be normally distributed, since it is the limit of the Riemann sums $$\lim_{n\rightarrow \infty}{\sum_{i=0}^{n-1}{W_{t_i}(t_{i+1}-t_i)}},$$

where $t_i=\frac{it}{n}$ and $W_{t_i}\sim N(0,t_i)$.

If this is true, the problem becomes trivial.

Update: Thanks for all the suggestions. Now I believe $X_t$ is a Gaussian process.

What about the integral $$Y_t=\int_0^t{f(W_s)\,ds}$$ if we assume that $f$ is a sufficiently nice function, say a polynomial or an exponential, e.g. $$Y_t=\int_0^t{e^{W_s}\,ds},$$ $$Y_t=\int_0^t{\left[a_n(W_s)^n+a_{n-1}(W_s)^{n-1}+\cdots+a_0\right]ds}\,?$$
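A quick Monte Carlo sketch (my own; all parameter choices here are arbitrary) suggests that $Y_t=\int_0^t{e^{W_s}ds}$ at least is *not* Gaussian: its estimated skewness is clearly positive, whereas a Gaussian variable has skewness $0$.

```python
import numpy as np

# Monte Carlo sketch: sample Y_1 = ∫_0^1 e^{W_s} ds via right-endpoint
# Riemann sums on simulated Brownian paths, then estimate the skewness.
rng = np.random.default_rng(42)
n_paths, n_steps = 4000, 400
dt = 1.0 / n_steps

dW = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
W = np.cumsum(dW, axis=1)            # W at t_1, ..., t_n for each path
Y = np.exp(W).sum(axis=1) * dt       # Riemann sum for ∫_0^1 e^{W_s} ds

skew = np.mean((Y - Y.mean()) ** 3) / Y.std() ** 3
print(f"estimated skewness of Y_1: {skew:.2f}")  # positive: heavy right tail
```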

  • Re your update: are you really asking whether $Y_t$ is Gaussian? Wow. (2011-03-19)
  • Did you get something out of one of the answers below? (2011-04-07)

3 Answers


As a linear functional of the Gaussian process $(W_s)_{0\le s\le t}$, $X_t$ is a Gaussian random variable. You indicate yourself that $X_t$ is centered and has variance $\sigma^2_t=\frac13t^3$, hence $X_t$ is distributed like $\sigma_tN$ with $N$ standard Gaussian.

Thus, for every $n$, $E((X_t)^{2n+1})=0$ and $E((X_t)^{2n})=(\sigma_t^2)^nE(N^{2n})$. If you know the moments of a standard Gaussian (and you should...), you are done.
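Written out as a small helper (a sketch of my own; the function name `X_moment` is not from the thread), using the standard Gaussian moments $E(N^{2k})=(2k-1)!!$:

```python
from math import prod

def X_moment(n, t):
    """E[(X_t)^n] for X_t = ∫_0^t W_s ds ~ N(0, t^3/3).

    Odd moments vanish; E[(X_t)^{2k}] = (t^3/3)^k * (2k-1)!!.
    """
    if n % 2 == 1:
        return 0.0
    k = n // 2
    double_factorial = prod(range(1, 2 * k, 2))  # (2k-1)!! = 1·3·…·(2k-1)
    return (t ** 3 / 3) ** k * double_factorial

# Matches the values from Itô's formula in the question:
# E[(X_t)^2] = t^3/3 and E[(X_t)^3] = 0.
```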

  • @Didier Piau: $X_t$ is the limit of a sequence of Gaussian random variables. We need to show that the limit is Gaussian. (2011-03-18)
  • @Jun Deng Each $X_t$ is the mean square limit of some linear combinations of $(W_s)_{0\le s\le t}$. The Gaussian property is preserved under linear combinations and mean square limits, thus $X_t$ is a Gaussian random variable. And in fact the process $(X_t)_t$ itself is Gaussian. (2011-03-18)
  • @Didier Piau: Thanks. I should study limit theory later. ^_^ (2011-03-18)
  • @Didier: Do you do stuff with stochastic analysis? (2011-03-18)
  • It is possible to check all this (the moments and the Gaussian nature of $X_t$) by studying the properties of the Fourier transform of $X_t$ (or its Laplace transform if you like it better). Regards. (2011-03-18)
  • @Jun Deng Sorry but this is not *limit theory*, rather one of the most important and basic facts about Gaussian families. (2011-03-19)
  • @Jonas T: I used to, a long time ago. (2011-03-19)
  • @TheBridge Are you sure the method you suggest goes through? For example, how to compute $E(\mathrm{e}^{ixX_t})$? (2011-03-19)
  • @TheBridge: I am also curious about the distribution of $\int_0^t{e^{W_s}ds}$. (2011-03-19)
  • @Didier: Thank you. Where can I find this theorem? (2011-03-19)
  • @Jun Deng In about any textbook/lecture notes on Gaussian processes. Which one do you know/are you working on? (2011-03-19)

The random variable $X_t$ is Gaussian for the reasons in Didier's answer.

You can calculate the variance directly (without Ito's formula) as follows:

$$\mathbb{E}\left[\left( \int^t_0 W_s ds \right)^2\right] = \int^t_0 \int^t_0 \mathbb{E}(W_r W_s) dr ds = \int^t_0 \int^t_0 (r\wedge s) dr ds ={t^3\over 3}.$$
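A quick numerical check of this double integral (a sketch of mine; the function name is made up), approximating it with a midpoint rule:

```python
import numpy as np

def integrated_bm_variance(t, n=800):
    """Approximate ∫_0^t ∫_0^t min(r, s) dr ds by a midpoint rule.

    Since E[W_r W_s] = min(r, s), this is Var(X_t), expected to be t^3/3.
    """
    h = t / n
    mid = (np.arange(n) + 0.5) * h       # midpoints of the n subintervals
    grid = np.minimum.outer(mid, mid)    # min(r, s) on the midpoint grid
    return grid.sum() * h * h

# integrated_bm_variance(1.0) is close to 1/3, as the formula predicts.
```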

  • Thanks. Your approach is much simpler and is easy to extend to higher dimensions. (2011-03-18)
  • @Jun Deng You're welcome. I'm not sure how easy it is to extend this approach to higher moments. In order to work out the fourth moment, I needed to first deduce the formula $$\mathbb{E}(W_{t_1}W_{t_2}W_{t_3}W_{t_4})=t_{(1)} (2t_{(2)}+t_{(3)}),$$ where the $t_{(i)}$s are the order statistics of the $t_i$s. It works, but it's not that easy, and I'm not sure what the general pattern is. (2011-03-18)
  • Say, for the triple case, $$1=\sum_{\sigma(t_1,t_2,t_3)}I_{t_1\leq t_2\leq t_3},$$ where $\sigma(t_1,t_2,t_3)$ ranges over all permutations of $t_1,t_2,t_3$. Hence we can get the integral by symmetry. Am I right? (2011-03-18)
  • Yes, you are right. It can be done, but the general formula is not obvious. I will try again for the general case. (2011-03-18)
  • By the discussion before, we can suppose that $t_1\le t_2\le\cdots\le t_n$ and set $\xi_1=W_{t_1},\ \xi_2=W_{t_2}-W_{t_1},\ \ldots,\ \xi_n=W_{t_n}-W_{t_{n-1}}$, which are independent Gaussian variables. Then we have $$E[W_{t_1}W_{t_2}\cdots W_{t_n}]=E[\xi_1(\xi_1+\xi_2)\cdots(\xi_1+\xi_2+\cdots+\xi_n)],$$ which can be computed. But I don't know the explicit formula. (2011-03-19)
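The fourth-moment formula above can be cross-checked against Wick's (Isserlis') theorem, which expands $\mathbb{E}(W_{t_1}W_{t_2}W_{t_3}W_{t_4})$ into the three pair products of the covariance $E[W_sW_t]=s\wedge t$. A small sketch (function names are mine):

```python
def cov(s, t):
    """Covariance of standard Brownian motion: E[W_s W_t] = min(s, t)."""
    return min(s, t)

def fourth_moment_wick(ts):
    """E[W_t1 W_t2 W_t3 W_t4] via Wick's theorem: sum over the 3 pairings."""
    t1, t2, t3, t4 = ts
    return (cov(t1, t2) * cov(t3, t4)
            + cov(t1, t3) * cov(t2, t4)
            + cov(t1, t4) * cov(t2, t3))

def fourth_moment_order_stats(ts):
    """The closed form from the comment above: t_(1) * (2 t_(2) + t_(3))."""
    a, b, c, _ = sorted(ts)
    return a * (2 * b + c)

# The two expressions agree on arbitrary time points:
for ts in [(1, 2, 3, 4), (0.5, 0.5, 2, 7), (3, 1, 4, 1.5)]:
    assert abs(fourth_moment_wick(ts) - fourth_moment_order_stats(ts)) < 1e-12
```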

Note that $X_T = \int_0^T W_t\,dt = T W_T - \int_0^T t\,dW_t$ by integration by parts. Thus $X_T$ is a linear combination of normally distributed random variables. (It may be easier to show that the latter integral is normal.)
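A pathwise sanity check of this identity (my own sketch, on a uniform grid): simulate one Brownian path and compare the Riemann sum for $\int_0^T W_t\,dt$ with $TW_T-\sum_i t_i\,\Delta W_i$.

```python
import numpy as np

rng = np.random.default_rng(0)
T, n = 1.0, 100_000
dt = T / n
t = np.arange(n) * dt                        # left endpoints t_0, ..., t_{n-1}
dW = rng.normal(0.0, np.sqrt(dt), n)         # Brownian increments
W = np.concatenate(([0.0], np.cumsum(dW)))   # W at t_0, ..., t_n

lhs = (W[:-1] * dt).sum()                    # Riemann sum for ∫_0^T W_t dt
rhs = T * W[-1] - (t * dW).sum()             # T·W_T − discrete ∫_0^T t dW_t

print(abs(lhs - rhs))                        # small: the two sides agree
```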