This is related to this question. How exactly does one go about computing the limits in the answer to the linked question?
Thanks.
P.S.: I would have commented on the linked question, but I don't have enough points.
So we have the sequence of constant random variables $X_n = 1+1/n$ (that is, $X_n(\omega) = 1+1/n$ for every $\omega \in \Omega$). How does one go about showing the following? My own (possibly flawed) attempt is sketched after the list.
a) $ \mathop {\lim }\limits_{n \to \infty } P(1 + 1/n \le x) = P(1 \le x) $ for every $x \neq 1$, i.e. at every continuity point of the limit distribution function, which shows that the sequence converges in distribution.

b) $ \mathop {\lim }\limits_{n \to \infty } P(|(1 + 1/n) - 1| > \varepsilon ) = 0 $ for every $\varepsilon > 0$, which shows that the sequence converges in probability.

c) $ P(\lim _{n \to \infty } (1 + 1/n) = 1) = 1, $ which shows that the sequence converges almost surely.

d) $ \mathop {\lim }\limits_{n \to \infty } {\rm E}|(1 + 1/n) - 1|^p = 0, $ which shows that the sequence converges in the $p$-th moment.
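Here is a sketch of how I would expect each computation to go (this is my own attempt, not taken from the linked answer, so please correct me if it is off). Everything rests on the fact that $X_n$ is deterministic, so every event involving $X_n$ has probability either $0$ or $1$.

a) Fix $x \neq 1$. If $x > 1$, then $1 + 1/n \le x$ holds for all $n > 1/(x-1)$, so $P(1 + 1/n \le x) = 1$ for all such $n$ and the limit is $1 = P(1 \le x)$. If $x < 1$, the event $\{1 + 1/n \le x\}$ is empty for every $n$, so $P(1 + 1/n \le x) = 0$ for all $n$ and the limit is $0 = P(1 \le x)$. (At $x = 1$ the two sides disagree, $0 \neq 1$, which is why that point is excluded.)

b) Fix $\varepsilon > 0$. Since $|(1 + 1/n) - 1| = 1/n$, the event $\{1/n > \varepsilon\}$ is empty as soon as $n \ge 1/\varepsilon$, so $P(|(1 + 1/n) - 1| > \varepsilon) = 0$ for all large $n$ and the limit is $0$.

c) For every $\omega \in \Omega$ we have $\lim_{n \to \infty} X_n(\omega) = \lim_{n \to \infty} (1 + 1/n) = 1$, so the event $\{\lim_{n \to \infty} X_n = 1\}$ is all of $\Omega$ and therefore has probability $1$.

d) For any $p > 0$, ${\rm E}|(1 + 1/n) - 1|^p = {\rm E}(1/n)^p = n^{-p} \to 0$ as $n \to \infty$.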