
Is there a relation between the max of a Gaussian random walk of 10 steps and the max of 10 one-step Gaussian random walks? Specifics (in Mathematica notation):

    (* a Gaussian random walk with standard deviation 1 *)
    a[0] := 0
    a[n_] := a[n-1] + RandomReal[NormalDistribution[0, 1]]

    (* the max of the walk over 10 steps *)
    b := Max[Table[a[i], {i, 1, 10}]]

    (* calculate the max many times to get a good sample set *)
    (* Mathematica "magic" ensures we're not using the same random #s each time *)
    c = Table[b, {i, 1, 10000}]

    (* the distribution isn't necessarily normal, but we can still compute mu and SD *)
    Mean[c]              (* 3.66464 *)
    StandardDeviation[c] (* 1.61321 *)

Now consider 10 people each doing a Gaussian random walk of 1 step, and take the max of these 10 values.

    (* max of 10 standard-normally distributed numbers *)
    d := Max[Table[RandomReal[NormalDistribution[0, 1]], {i, 1, 10}]]

    (* get a good sample set *)
    f = Table[d, {i, 10000}]

    (* and now the mean and SD *)
    Mean[f]              (* 1.54843 *)
    StandardDeviation[f] (* 0.580593 *)

The two means/SDs are obviously different, but I sense they're related somehow, perhaps by a factor of Sqrt[10], since the sum (not the max) of the 10 one-step walks is normal with SD Sqrt[10], and I sense that the cumulative sums of the first 9 steps somehow cancel out.
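A quick sanity check of that last fact, in the same Mathematica style as the snippets above (the symbol g and the sample size are arbitrary choices):

    (* the sum of 10 unit-normal steps is N(0, Sqrt[10]); Sqrt[10] is about 3.162 *)
    g = Table[Total[RandomReal[NormalDistribution[0, 1], 10]], {i, 1, 10000}];
    Mean[g]              (* should be near 0 *)
    StandardDeviation[g] (* should be near 3.16 *)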

Are these known distributions?

1 Answer


I don't have an answer, only some trivial observations about your question that I post below; this got too long to be left as a comment. If I understand your question correctly, you want to compare the following two problems:

1) Let $X_i$ be i.i.d. unit normals; you are interested in $X^{*} = \max(X_i, 1 \leq i \leq 10)$.

2) Let $Y_i$ be i.i.d. unit normals. Define $S_0 = 0$ and $S_i = S_{i-1} + Y_i$ for $i > 0$, and let $S^{*} = \max(S_i, 1 \leq i \leq 10)$.

It is easy to see that $P(X^{*} \leq x) = P(X_i \leq x \text{ for all } 1 \leq i \leq 10) = \Phi(x)^{10}$ by independence, and differentiating gives the density of $X^{*}$:

$f_{X^{*}}(x) = 10 \Phi(x)^9 \phi(x)$, where $\phi$ and $\Phi$ are the pdf and cdf of the standard normal. From this you can, in principle, compute the mean of $X^{*}$, though I don't know whether the messy integral has a nice closed form.
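For what it's worth, the integral is easy to evaluate numerically from the density above; a minimal sketch using NIntegrate (no claim that a closed form exists):

    (* numerical mean of X*: integrate x * 10 Phi(x)^9 phi(x) over the real line *)
    NIntegrate[
      10 x CDF[NormalDistribution[0, 1], x]^9 PDF[NormalDistribution[0, 1], x],
      {x, -Infinity, Infinity}]
    (* comes out around 1.54, consistent with the simulated Mean[f] in the question *)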

The case of determining the distribution of $S^{*}$ seems much trickier. I found some references online that study the asymptotic behavior for long random walks, and even that seems very hard. You can argue that $P(S^{*} \leq s) = P(S_i \leq s \text{ for all } 1 \leq i \leq n)$, but the $S_i$ are dependent random variables. Their joint distribution is easy to derive, but once again I don't know whether the integration is manageable.
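For completeness: since the increments $Y_i$ are independent unit normals, the joint density factorizes as $f_{S_1,\dots,S_n}(s_1,\dots,s_n) = \prod_{i=1}^{n} \phi(s_i - s_{i-1})$ with $s_0 = 0$. Short of doing that integral, $E(S^{*})$ can at least be estimated by direct Monte Carlo over partial-sum paths; a minimal sketch in the question's Mathematica style (sMax and sample are arbitrary names; Accumulate builds $S_1, \dots, S_{10}$ from a single set of draws):

    (* Monte Carlo estimate of E[S*] and its SD; Accumulate gives the partial sums *)
    sMax := Max[Accumulate[RandomReal[NormalDistribution[0, 1], 10]]]
    sample = Table[sMax, {i, 1, 10000}];
    {Mean[sample], StandardDeviation[sample]}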

Of course, if you are only interested in comparing the first moments of $X^{*}$ and $S^{*}$, there might be a clever way to do it that avoids the integration entirely. If such a proof exists, I would love to learn it.

Update: I found some links that give the asymptotics for these random variables as $n \rightarrow \infty$. See this and this. As best as I can determine, $E(X^{*})$ grows as $\sqrt{2\ln(n)}$ and $E(S^{*})$ grows as $\sqrt{\frac{2n}{\pi}}$.
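As a rough point of reference, the leading-order expressions can be evaluated at $n = 10$; a small sketch (these are asymptotic formulas, so the lower-order correction terms still matter at such a small $n$):

    (* leading-order asymptotics evaluated at n = 10, for rough comparison only *)
    With[{n = 10},
      {N[Sqrt[2 Log[n]]],    (* about 2.15, leading order for E[X*] *)
       N[Sqrt[2 n/Pi]]}]     (* about 2.52, leading order for E[S*] *)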

  • clearly you meant $P(X^{*} \leq x) = P(X_i \leq x \text{ for all } 1 \leq i \leq 10)$ – 2010-12-29