I am having trouble with the following problem.
Let $c = \int_{e}^{\infty} \frac{dx}{(x \log x)^{2}}$ and let $(X_n)_{n=1}^\infty$ be a sequence of i.i.d. random variables, where $X_{1}$ has probability density function $$ f(x) = \begin{cases} \frac{1}{2}(1-c)^{2}, & -\frac{2}{1-c} \leq x \leq 0, \\[2pt] \frac{1}{(x\log x)^{2}}, & x \geq e, \\[2pt] 0, & \text{elsewhere.} \end{cases} $$ Prove that $\sum_{n=1}^{\infty} X_{n}/n$ diverges a.c.
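Before worrying about a proof, I tried a quick numerical sanity check (not a proof, of course). The sketch below is entirely my own construction, assuming I am reading the density correctly: the numerical value of $c$, the rejection sampler for the tail piece, and the sample size are my choices, not part of the problem.

```python
import numpy as np
from scipy.integrate import quad

rng = np.random.default_rng(0)

# c = int_e^inf dx / (x log x)^2, computed numerically (I don't know a closed form)
c, _ = quad(lambda x: 1.0 / (x * np.log(x)) ** 2, np.e, np.inf)

def sample_tail():
    """Sample from the density proportional to 1/(x log x)^2 on [e, inf) by rejection.

    Proposal: X = e / V with V uniform on (0, 1] has density e / x^2 on [e, inf);
    accepting with probability 1/(log x)^2 <= 1 leaves the desired density.
    """
    while True:
        x = np.e / (1.0 - rng.uniform())          # Pareto-type proposal on [e, inf)
        if rng.uniform() < 1.0 / np.log(x) ** 2:
            return x

def sample_X(n):
    """Draw n i.i.d. samples from f: uniform piece with prob. 1 - c, tail piece with prob. c."""
    out = np.empty(n)
    for i in range(n):
        if rng.uniform() < 1.0 - c:
            out[i] = rng.uniform(-2.0 / (1.0 - c), 0.0)   # constant piece on [-2/(1-c), 0]
        else:
            out[i] = sample_tail()
    return out

N = 200_000
X = sample_X(N)
partial_sums = np.cumsum(X / np.arange(1, N + 1))

# If the claim is true, the partial sums should drift (very slowly, roughly like
# -log log n if my back-of-envelope estimate is right) rather than settle down.
print(partial_sums[[99, 999, 9_999, 99_999, N - 1]])
```

Any drift is so slow that I don't read much into the output; it only made the claim look plausible to me.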
I think there is one way to prove this: use the Kolmogorov Three-Series Criterion and show that one of its three series fails to converge, so that $\sum_{n=1}^{\infty} X_n/n$ cannot converge a.c., and then apply the Kolmogorov 0-1 Law. Since convergence of the series is a tail event, its probability is either 0 or 1, so once it is not 1 it must be 0, i.e. the series diverges a.c. The only problem is carrying this out explicitly with the density I have.
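In case it helps to see where I get stuck (and assuming my calculus is right, so please correct me if not), the piece that looks decisive to me is the series of truncated means for $Y_n = X_n/n$ with truncation level $1$. The constants seem rigged so that $X_1$ has mean zero:
$$ E[X_1] = \frac{(1-c)^{2}}{2}\int_{-2/(1-c)}^{0} x\,dx + \int_{e}^{\infty} \frac{x\,dx}{(x\log x)^{2}} = -1 + 1 = 0, $$
and then, for all $n$ large enough that $[-2/(1-c),\,0] \subset [-n,n]$,
$$ E\!\left[\frac{X_n}{n}\,\mathbf{1}_{\{|X_n|\le n\}}\right] = -\frac{1}{n}\,E\!\left[X_1\,\mathbf{1}_{\{X_1 > n\}}\right] = -\frac{1}{n}\int_{n}^{\infty}\frac{dx}{x(\log x)^{2}} = -\frac{1}{n\log n}, $$
so the series of truncated means behaves like $-\sum 1/(n\log n)$, which diverges. Is this the right series to attack, and is the rest (the other two series, and the passage to a.c. divergence) as routine as it looks?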
Thanks in advance for any help.