
I have $\log X \sim \mathrm{Exp}(\vartheta - 1)$ and I would like to show

$$P\left[\,\left|\frac{1}{\frac{1}{n}\sum_{i=1}^n \log X_i} - (\vartheta - 1)\right| > \varepsilon\right] \to 0 \quad \forall\, \vartheta \quad (n \to \infty)$$

The answer to this question states that all moments of the exponential distribution that are necessary for the strong law of large numbers exist, and therefore $\frac{1}{\frac{1}{n} \sum_{i=1}^n \log X_i}$ converges almost surely to $\frac{1}{\vartheta - 1}$.

Can someone explain to me why the moments need to exist for the law of large numbers, and how this proof works? Many thanks for your help!

  • Wikipedia claims that the law of large numbers holds whenever the random variable has finite expectation, and you don't need any higher moments. – 2011-02-10

1 Answer


First, the convergence of $\frac{1}{{\frac{1}{n}\sum\nolimits_{i = 1}^n {\log X_i } }}$ is towards $\vartheta - 1$, not $\frac{1}{{\vartheta - 1}}$, of course. Second, as Yuval indicated, it suffices that the first moment (the expectation) exists; an assumption of finite variance is often made for convenience, but it is not necessary. Since $\mathbb{E}[\log X_i] = \frac{1}{\vartheta - 1}$, the strong law of large numbers gives
$$\frac{1}{n}\sum_{i=1}^n \log X_i \to \frac{1}{\vartheta - 1} \quad \text{almost surely}.$$
Finally, continuous functions are limit-preserving even when their arguments are sequences of random variables (see the continuous mapping theorem): since $x \mapsto \frac{1}{x}$ is continuous at $\frac{1}{\vartheta - 1} \neq 0$, the reciprocal converges almost surely to $\vartheta - 1$, and almost sure convergence implies convergence in probability.
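
If it helps to see the statement numerically, here is a minimal Monte Carlo sketch (not part of the original answer): it draws $\log X_i \sim \mathrm{Exp}(\vartheta - 1)$ and estimates $P\big[\,|\frac{1}{\frac{1}{n}\sum \log X_i} - (\vartheta - 1)| > \varepsilon\,\big]$ for increasing $n$. The parameter value `theta = 3.0`, the tolerance `eps`, and the repetition count are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

theta = 3.0    # assumed true parameter, so theta - 1 = 2 (hypothetical choice)
eps = 0.1      # tolerance in the convergence-in-probability statement
reps = 10_000  # Monte Carlo repetitions per sample size

for n in [10, 100, 1_000, 10_000]:
    # numpy's exponential sampler takes the scale 1/(theta - 1),
    # i.e. the mean of log X
    log_x = rng.exponential(scale=1.0 / (theta - 1), size=(reps, n))
    # the estimator: reciprocal of the sample mean of log X_i
    estimate = 1.0 / log_x.mean(axis=1)
    # fraction of repetitions where the estimator misses by more than eps
    miss = np.mean(np.abs(estimate - (theta - 1)) > eps)
    print(f"n={n:>6}: P[|estimate - (theta - 1)| > eps] ~ {miss:.4f}")
```

The printed frequencies shrink towards $0$ as $n$ grows, which is exactly the convergence in probability claimed above.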