The original question is as follows:
$\left\{Y_n\right\}$ are i.i.d. random variables. Find a necessary and sufficient condition for $(\max_{m\le n}Y_m)/n \rightarrow 0$ a.s.
The answer is $EY_1^+ < \infty$. Then for every $\epsilon>0$, $\sum_n P(Y_n/n > \epsilon)\le EY_1^+/\epsilon< \infty$, so by Borel–Cantelli $\limsup_n Y_n^+/n \le 0$ a.s. I can't see why this implies $(\max_{m\le n}Y_m)/n \rightarrow 0$ a.s.
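For reference, the Borel–Cantelli step can be spelled out with the standard tail-sum bound (this uses only the i.i.d. assumption above): for fixed $\epsilon > 0$,

$$\sum_{n\ge 1} P(Y_n > n\epsilon) = \sum_{n\ge 1} P(Y_1^+ > n\epsilon) \le \frac{1}{\epsilon}\int_0^\infty P(Y_1^+ > t)\,dt = \frac{EY_1^+}{\epsilon} < \infty,$$

since $P(Y_1^+ > t)$ is nonincreasing, so $\epsilon\, P(Y_1^+ > n\epsilon) \le \int_{(n-1)\epsilon}^{n\epsilon} P(Y_1^+ > t)\,dt$. Borel–Cantelli then gives $P(Y_n > n\epsilon \text{ i.o.}) = 0$ for each $\epsilon > 0$, and intersecting over $\epsilon = 1/k$, $k \in \mathbb{N}$, yields $\limsup_n Y_n^+/n \le 0$ a.s.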
My attempt is as follows:
Let $E=\left\{Y_n^+/n \le 0 \text{ i.o.}\right\}$. Then $P(E)=1$, and for $\omega \in E$ we have $Y_n^+(\omega)/n \le 0$ infinitely often. Why does this imply $(\max_{m\le n}Y_m)/n \rightarrow 0$ a.s.? One candidate counterexample I have in mind: the condition holds along all even indices while failing along all odd ones.
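As a side note, here is a quick numerical sanity check of the claimed equivalence (a sketch only; the particular distributions, sample size, and seed are my own choices, not part of the question). It contrasts a light-tailed case with $EY_1^+ < \infty$ against a heavy-tailed case with $EY_1^+ = \infty$:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000  # number of i.i.d. samples (illustrative choice)

def running_max_over_n(samples):
    """Return the sequence (max_{m<=n} Y_m)/n for n = 1..len(samples)."""
    n = np.arange(1, len(samples) + 1)
    return np.maximum.accumulate(samples) / n

# Light tail: standard normals, EY_1^+ < infinity,
# so (max_{m<=n} Y_m)/n should tend to 0.
light = running_max_over_n(rng.standard_normal(N))

# Heavy tail: classical Pareto with alpha = 1 (note np.random draws
# Lomax, i.e. Pareto shifted by 1, hence the "+ 1"), EY_1^+ = infinity,
# so (max_{m<=n} Y_m)/n should NOT tend to 0.
heavy = running_max_over_n(rng.pareto(1.0, N) + 1.0)

print("light-tailed final value:", light[-1])  # expected to be tiny
print("heavy-tailed final value:", heavy[-1])  # expected to stay away from 0
```

Of course a simulation proves nothing, but it does make the dichotomy in the exercise visible: in the normal case the running maximum grows like $\sqrt{2\log n}$, which is crushed by $n$, while in the $\alpha=1$ Pareto case the maximum itself grows linearly in $n$.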