I would like to prove that the sum of Gaussian processes is again Gaussian; to be precise, that $M_t=W_t+W_{t^2}$ is a Gaussian process, where $W_t$ is a standard Wiener process. This seems obvious, but I am looking for a rigorous, as-short-as-possible proof, other than simply saying that it is the sum of two Gaussian processes.

I have some thoughts, but something seems to be missing. The variables $W_t$ and $W_{t^2}$ are dependent, but $M_t=2W_t+(W_{t^2}-W_t)$, and if $t^2\geq t$ the two terms are independent, so the standard proof for sums of independent Gaussian random variables would apply. However, as far as I know, something more is needed when dealing with processes rather than single random variables.

A random process $X_t$ is Gaussian $\Leftrightarrow$ for all $n\geq1$, $t_1,\dots,t_n$, $\lambda_1,\dots,\lambda_n$, the sum $\sum^n_{k=1}\lambda_k X_{t_k}$ is Gaussian. Then
$$\sum^n_{k=1}\lambda_k M_{t_k}=\left[\sum^n_{k=1}\lambda_k W_{t_k}\right]+\left[\sum^n_{k=1}\lambda_k W_{t_k^2}\right],$$
and by the same proposition each term is Gaussian, since $W_t$ and $W_{t^2}$ are Gaussian processes. Moreover,
$$\left[\sum^n_{k=1}\lambda_k W_{t_k}\right]+\left[\sum^n_{k=1}\lambda_k W_{t_k^2}\right]=\left[\sum^n_{k=1}2\lambda_k W_{t_k}\right]+\left[\sum^n_{k=1}\lambda_k \left(W_{t_k^2}-W_{t_k}\right)\right],$$
so maybe now I could conclude by using that proof for random variables?
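As a side note (not part of the question, just a sketch for reference): once Gaussianity is established, the law of $M$ is determined by its mean and covariance, and the covariance follows from $\operatorname{Cov}(W_a,W_b)=\min(a,b)$ by bilinearity:

```latex
% Covariance of M_t = W_t + W_{t^2}, using E[W_a] = 0 and
% Cov(W_a, W_b) = min(a, b); expand by bilinearity into four terms.
\[
  \operatorname{Cov}(M_s, M_t)
  = \min(s,t) + \min(s,t^2) + \min(s^2,t) + \min(s^2,t^2),
  \qquad \operatorname{E}[M_t] = 0.
\]
```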
Sum of Gaussian processes
stochastic-processes
1 Answer
As requested by the OP, my comment has been converted into an answer.
Perhaps you are making this harder than it is. Isn't $(W_{t_1},W_{t_1^2},W_{t_2},W_{t_2^2},\ldots,W_{t_n},W_{t_n^2})$ a Gaussian vector (meaning the $2n$ random variables are jointly Gaussian) and so any linear transformation applied to this vector results in a Gaussian vector? There is no requirement that the $2n$ random variables in question have to be independent for this linear transformation property to hold. It is joint Gaussianity that is required, and joint Gaussianity is guaranteed since the random variables are from a Gaussian process.
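Spelled out, the argument in the comment is a one-line computation (fixing $n\geq1$, times $t_1,\dots,t_n$, and scalars $\lambda_1,\dots,\lambda_n$):

```latex
% Every finite linear combination of the M_{t_k} is a linear
% functional of the Gaussian vector
%   Z = (W_{t_1}, W_{t_1^2}, \ldots, W_{t_n}, W_{t_n^2}),
% hence univariate Gaussian:
\[
  \sum_{k=1}^{n} \lambda_k M_{t_k}
  = \sum_{k=1}^{n} \lambda_k \bigl( W_{t_k} + W_{t_k^2} \bigr)
  = \langle \mu, Z \rangle,
  \qquad
  \mu = (\lambda_1, \lambda_1, \lambda_2, \lambda_2, \ldots, \lambda_n, \lambda_n),
\]
% and this is exactly the definition of M being a Gaussian process.
```

No independence of the $2n$ coordinates is used anywhere; joint Gaussianity of $Z$ (which holds because all coordinates come from the single Gaussian process $W$) is all that is needed.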