
I am having trouble with the following problem.

Let $c = \int_{e}^{\infty} \frac{dx}{(x \log x)^{2}}$ and let $(X_n)_{n=1}^\infty$ be a sequence of i.i.d. random variables where $X_{1}$ has probability density function $$ f(x) = \begin{cases} \frac{1}{2}(1-c)^{2}, & -\frac{2}{1-c} \leq x \leq 0 \\[2pt] \frac{1}{(x\log x)^{2}}, & x \geq e \\[2pt] 0, & \text{elsewhere.} \end{cases} $$ Prove that $\sum_{n=1}^{\infty} X_{n}/n$ diverges almost surely.

I think I see one way to prove this: use the Kolmogorov Three-Series Criterion, show that it is not satisfied, and then apply the Kolmogorov 0–1 Law to conclude that the series diverges almost surely. The only problem is explicitly carrying this out with the density I have.
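As a numerical sanity check on the setup (not part of any proof), one can verify that $f$ integrates to $1$ and that $\mathbb{E}[X_1] = 0$. The substitution $u = 1/\log x$ turns the tail integral $\int_e^\infty \frac{dx}{(x\log x)^2}$ into the finite-range integral $\int_0^1 e^{-1/u}\,du$, which a simple midpoint rule handles; the `midpoint` helper below is ad hoc, not from any library.

```python
import math

def midpoint(g, a, b, n=200_000):
    """Midpoint-rule approximation of the integral of g over [a, b]."""
    h = (b - a) / n
    return h * sum(g(a + (i + 0.5) * h) for i in range(n))

# Tail mass c = \int_e^\infty dx/(x log x)^2; with u = 1/log x this
# becomes \int_0^1 e^{-1/u} du, a finite-range integral.
c = midpoint(lambda u: math.exp(-1.0 / u), 0.0, 1.0)

# Mass of the negative block: constant density (1/2)(1-c)^2 on [-2/(1-c), 0],
# so its mass is (1/2)(1-c)^2 * (2/(1-c)) = 1 - c, and total mass is 1.
neg_mass = 0.5 * (1 - c) ** 2 * (2 / (1 - c))

# Mean of the negative block: (1/2)(1-c)^2 * \int_{-2/(1-c)}^0 x dx = -1.
neg_mean = 0.5 * (1 - c) ** 2 * (-(2 / (1 - c)) ** 2 / 2)

# Tail mean: \int_e^\infty x f(x) dx = \int_e^\infty dx/(x log^2 x) = 1
# exactly (antiderivative -1/log x), so E[X_1] = -1 + 1 = 0.
tail_mean = 1.0

print(round(c, 4))                     # ≈ 0.1485
print(round(neg_mass + c, 6))          # total mass ≈ 1.0
print(round(neg_mean + tail_mean, 6))  # E[X_1] ≈ 0.0
```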

Thanks for any help in advance.

1 Answer


I was making this out to be a bigger issue than it needed to be.

All I had to do was show that the second series in the three-series criterion fails to converge. With truncation level $1$ for the summands $X_n/n$, the $n$-th term of that series is $\frac{1}{n}\,\mathbb{E}\big[X_1 \mathbf{1}_{\{|X_1| \le n\}}\big]$, which can be computed by integrating against the given density; summing over $n$ from $1$ to $\infty$ then shows divergence.
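Spelling the computation out (a sketch; truncating the summand $X_n/n$ at level $1$ is the same as truncating $X_n$ at level $n$): first,

$$\mathbb{E}[X_1] \;=\; \frac{(1-c)^2}{2}\int_{-2/(1-c)}^{0} x\,dx \;+\; \int_{e}^{\infty} \frac{dx}{x\log^{2}x} \;=\; -1 + 1 \;=\; 0,$$

using $\int_e^\infty \frac{dx}{x\log^2 x} = \big[-\frac{1}{\log x}\big]_e^\infty = 1$. Since the negative part of the support, $[-2/(1-c), 0]$, lies inside $[-n, 0]$ for all $n \ge 3$, the truncation only removes the right tail, so

$$\mathbb{E}\!\left[\frac{X_1}{n}\,\mathbf{1}_{\{|X_1|\le n\}}\right] \;=\; \frac{1}{n}\left(\mathbb{E}[X_1] - \int_{n}^{\infty}\frac{dx}{x\log^{2}x}\right) \;=\; -\frac{1}{n\log n},$$

and $\sum_n \frac{1}{n\log n}$ diverges by the integral test. Hence the second series diverges, the three-series theorem shows $\sum_n X_n/n$ does not converge a.s., and the Kolmogorov 0–1 law upgrades "does not converge a.s." to "diverges a.s."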

  • Exactly, and you only need to verify that the second condition of the Kolmogorov Three-Series Criterion is not satisfied. You can refer to the solution sheet for details. (2017-02-06)