I found this exercise at the end of a chapter about Brownian motion. Let $(X_j)_{j=1}^{2^M}$ be independent standard Gaussian random variables, where $M$ is an integer. Define $B_k:=2^{-M/2}\sqrt{T}\sum_{j=1}^k X_j$ for $1\le k\le 2^M$, where $T>0$. I have to show that $(B_k)_{k=1}^{2^M}$ has the same distribution as $(W_{k2^{-M}T})_{k=1}^{2^M}$, where $(W_t)_{t\ge 0}$ is a standard Brownian motion. So far I did the following: for the matrix $A$ in $2^{-M/2}\sqrt{T}A$ I've chosen the lower-triangular matrix of ones: $$
\begin{pmatrix}
1 & 0 & \cdots & 0 \\
1 & 1 & \ddots & \vdots \\
\vdots & \vdots & \ddots & 0 \\
1 & 1 & \cdots & 1
\end{pmatrix}
$$ Then clearly $2^{-M/2}\sqrt{T}AX=B$, seen as vectors. Hence I know that $B$ is normally distributed with mean $0$ and covariance matrix $(2^{-M/2}\sqrt{T}A)\cdot \mathrm{Id}\cdot (2^{-M/2}\sqrt{T}A)^T = 2^{-M}T\,A A^T$. For the product $A\cdot A^T$ I got: $$
\begin{pmatrix}
1 & 1 & 1 & \cdots & 1 \\
1 & 2 & 2 & \cdots & 2 \\
1 & 2 & 3 & \cdots & 3 \\
\vdots & \vdots & \vdots & \ddots & \vdots \\
1 & 2 & 3 & \cdots & 2^M
\end{pmatrix}
$$
with the factor $2^{-M}T$ in front, i.e. the $(j,k)$ entry of $A\cdot A^T$ is $\min\{j,k\}$. Clearly $(W_{k2^{-M}T})_{k=1}^{2^M}$ has expectation zero and is jointly Gaussian. But I do not see why the covariance matrices are the same. I hope someone can help me. Thanks in advance. hulik
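(Not part of the original question, but the claim that $A\cdot A^T$ has entries $\min\{j,k\}$ is easy to sanity-check numerically. The following is a minimal sketch assuming NumPy; all variable names are my own choosing.)

```python
import numpy as np

M, T = 3, 2.0
n = 2**M

# Lower-triangular matrix of ones, so that (A @ X)_k = X_1 + ... + X_k.
A = np.tril(np.ones((n, n)))

# A @ A.T should have (j, k) entry min(j, k) (in 1-based indexing).
AAt = A @ A.T
expected = np.minimum.outer(np.arange(1, n + 1), np.arange(1, n + 1))
print(np.array_equal(AAt, expected))  # True

# Hence Cov(B) = 2^{-M} T * A A^T has entries 2^{-M} T * min(j, k).
cov_B = 2.0**(-M) * T * AAt
```

The check works because the $(j,k)$ entry of $AA^T$ counts the indices $i$ with $i\le j$ and $i\le k$, which is exactly $\min\{j,k\}$.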
Show that two random variables are equal in distribution
probability
probability-theory
brownian-motion
1 Answer
The covariance matrices are the same. Indeed, if $j\leq k$, then, since the increment $W_{k2^{-M}T}-W_{j2^{-M}T}$ is independent of $W_{j2^{-M}T}$ (independence of Brownian increments), \begin{align}\operatorname{Cov}(W_{k2^{-M}T},W_{j2^{-M}T})&=\operatorname{Cov}(W_{k2^{-M}T}-W_{j2^{-M}T},W_{j2^{-M}T})+\operatorname{Cov}(W_{j2^{-M}T},W_{j2^{-M}T})\\ &=\operatorname{Cov}(W_{j2^{-M}T},W_{j2^{-M}T})\\ &=j2^{-M}T\\ &=2^{-M}T\min\{j,k\}, \end{align} which is exactly the $(k,j)$ entry of $2^{-M}T\,AA^T$ obtained above.
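As an optional numerical illustration of this identity (a Monte Carlo sketch assuming NumPy, with parameters of my own choosing), one can simulate Brownian paths on the grid $k2^{-M}T$ and compare the empirical covariance with $2^{-M}T\min\{j,k\}$:

```python
import numpy as np

rng = np.random.default_rng(0)
M, T = 3, 1.0
n = 2**M
dt = T / n  # grid spacing 2^{-M} T

# Target covariance: Cov(W_{k dt}, W_{j dt}) = dt * min(j, k).
target = dt * np.minimum.outer(np.arange(1, n + 1), np.arange(1, n + 1))

# Simulate Brownian motion on the grid via independent N(0, dt) increments.
paths = 200_000
increments = rng.normal(0.0, np.sqrt(dt), size=(paths, n))
W = np.cumsum(increments, axis=1)  # W[:, k-1] approximates W_{k dt}

emp_cov = np.cov(W, rowvar=False)
print(np.max(np.abs(emp_cov - target)))  # small (Monte Carlo error)
```

The maximal deviation shrinks like $1/\sqrt{\text{paths}}$, consistent with the derivation above.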