The mean of a random walk is expected to be 0, and its variance grows like $n$ — so its typical size grows like $\sqrt{n}$. The same holds for sums of iid random variables with mean 0 and variance 1.
What happens if we just add them? By the law of the iterated logarithm, the peaks of the sum grow like
$ \limsup_{n \to \infty} \frac{Y_1 + \dots + Y_n}{\sqrt{2 n \log \log n}} = 1 \quad \text{almost surely.} $ Intuitively, $\log n$ is the number of "bits" or "digits" of $n$.
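As a rough numerical illustration (a sketch, not from the text above — the function name `walk_peaks` and all parameters are mine), one can simulate a $\pm 1$ random walk and compare the running peak of $|S_k|$ against the envelope $\sqrt{2 n \log \log n}$. Note that $\log \log n$ barely moves at simulable scales, so this only hints at the asymptotics:

```python
import math
import random

def walk_peaks(n_steps, seed=0):
    """Simulate a +/-1 random walk; return the running peak of |S_k|
    and the LIL envelope sqrt(2 n log log n) at the final step."""
    rng = random.Random(seed)
    s, peak = 0, 0
    for _ in range(n_steps):
        s += rng.choice((-1, 1))
        peak = max(peak, abs(s))
    envelope = math.sqrt(2 * n_steps * math.log(math.log(n_steps)))
    return peak, envelope

peak, env = walk_peaks(200_000, seed=42)
print(f"peak |S_k| = {peak}, sqrt(n) = {math.sqrt(200_000):.0f}, LIL envelope = {env:.0f}")
```

At any fixed horizon the peak typically sits below the envelope; the $\limsup$ reaching 1 is a statement about infinitely many ever-rarer excursions.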
Another issue is the difference between "almost sure" and "in probability" convergence.
- For almost sure convergence, for almost every sequence of "coin-flips" $\omega$, the sequence $X_1(\omega), X_2(\omega), \dots$ converges to $X(\omega)$.
- For convergence in probability, you measure each $X_n$ individually: $\mathbb{P}\big[|X_n - X | > \epsilon\big] \to 0$ for every $\epsilon > 0$.
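A standard textbook example separating the two modes (not from the text above): take independent $X_n \sim \mathrm{Bernoulli}(1/n)$. Then $\mathbb{P}[X_n = 1] = 1/n \to 0$, so $X_n \to 0$ in probability; but since $\sum 1/n = \infty$, Borel–Cantelli says $X_n = 1$ infinitely often, so there is no almost sure convergence. The miss probabilities telescope, which the sketch below (helper name `hit_probability` is mine) checks numerically:

```python
def hit_probability(N):
    """P(X_n = 1 for some n in (N, 2N]) with independent X_n ~ Bernoulli(1/n).
    The miss product telescopes: prod_{n=N+1}^{2N} (n-1)/n = N/(2N) = 1/2,
    so the chance of a late '1' never decays -- no almost sure convergence."""
    p_miss = 1.0
    for n in range(N + 1, 2 * N + 1):
        p_miss *= 1 - 1 / n
    return 1 - p_miss

print(hit_probability(10), hit_probability(5000))  # both ~0.5, for any N
```

So no matter how far out you look, a window $(N, 2N]$ still contains a $1$ with probability $1/2$.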
According to Wikipedia, $\frac{1}{\sqrt{2n \log \log n}} \left[ Y_1 + \dots + Y_n\right] \to 0$ in probability but not almost surely (almost surely its $\limsup$ is $1$)... so although random walks are thought to grow like $\sqrt{n}$, the "peak value" grows slightly faster.
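A one-line reason for the convergence in probability (a standard argument, spelled out here for completeness):

```latex
\frac{Y_1 + \dots + Y_n}{\sqrt{2n\log\log n}}
  = \frac{(Y_1 + \dots + Y_n)/\sqrt{n}}{\sqrt{2\log\log n}} .
```

By the central limit theorem the numerator converges in distribution to $N(0,1)$, hence is tight, while the denominator $\sqrt{2\log\log n} \to \infty$; a tight sequence divided by something tending to infinity goes to $0$ in probability. The almost sure behavior is different because the rare large excursions keep recurring along the whole sequence.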