
I have to prove something about normally distributed variables.

Let $X_1, X_2, ... $ be i.i.d. with normal distribution $N(\mu, \sigma^2)$, where $\mu >0$. Define: $S_n := X_1 + X_2 + ... + X_n$ and $Z_n := \max\{S_0, S_1, S_2, ..., S_n\}$, where $S_0=0$.

I must show that the random variable $Z_{\infty} = \max\{S_0, S_1, S_2, ...\}$ is almost surely finite, i.e. that $P(Z_{\infty} < \infty)=1$.

Any help will be very appreciated. Thanks.

  • @Sam $Z_{\infty}<1$...? That's far from being true. (2017-02-12)
  • Continuous probability distributions are normalised to one: $\int^{\infty}_{0} f(x)\,dx = 1$. This means that $P(Z_{\infty} < \infty) < 1$; however, its expectation value will have the same weight, due to some function $g(x)$ that generates the values in $N(\mu,\sigma^{2})$. So $g(x)$ may simply be a value of $x$: $\int_{a}^{b} g(x_{n}) f(x)\,dx = X_{n}$. (2017-02-12)
  • @saz Corrected. (2017-02-12)

1 Answer


This is wrong. In fact, by the strong law of large numbers we know that $S_n/n \to \mu$ almost surely as $n \to \infty$. Under your assumption that $\mu>0$, this gives $S_n \sim n\mu$, so $S_n \to + \infty$ almost surely and thus $P(Z_{\infty}=\infty)=1$.

My guess is that the problem is stated incorrectly, because the statement becomes true if you reverse signs. In other words, it is true that $P(\inf_n S_n>-\infty) = 1$. The reason is exactly the same: since $S_n/n \to \mu>0$ as $n \to \infty$, $S_n$ can be negative for only finitely many $n$ (almost surely), so the infimum is attained among finitely many terms and is finite.
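Not a proof, but a quick Monte Carlo sketch of the drift argument above (the values of `mu`, `sigma`, and `n` are arbitrary choices for illustration): with $\mu > 0$ the running maximum keeps growing with $n$, while the running infimum settles at a finite value.

```python
import numpy as np

# Illustration only: simulate S_n = X_1 + ... + X_n with X_i ~ N(mu, sigma^2),
# mu > 0. By the SLLN, S_n / n -> mu, so S_n drifts to +infinity: the running
# max is unbounded, while the running infimum is finite (attained early on).
rng = np.random.default_rng(0)
mu, sigma, n = 0.5, 1.0, 100_000

steps = rng.normal(mu, sigma, size=n)
S = np.cumsum(steps)                # S_1, ..., S_n

Z_n = max(0.0, S.max())             # running max, including S_0 = 0
inf_n = min(0.0, S.min())           # running infimum, including S_0 = 0

print(f"S_n / n   = {S[-1] / n:.4f}  (near mu = {mu})")
print(f"max_k S_k = {Z_n:.2f}  (grows without bound as n grows)")
print(f"inf_k S_k = {inf_n:.2f}  (stays finite)")
```

Rerunning with larger `n` makes `Z_n` grow roughly like `mu * n`, while `inf_n` stops changing once the drift takes over, which is exactly the dichotomy in the answer.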