Let $X_1, X_2, \dots$ be a sequence of integer-valued random variables that converges in distribution to some random variable $X$. Am I right in thinking that we can always pick $X$ to be integer-valued?
I thought like this: let $F_n$ be the distribution function of $X_n$ and $F$ the distribution function of $X$. Then $X$ fails to be integer-valued only if $F$ increases strictly between two consecutive integers, i.e. only if there are an integer $m$ and points $m < x_1 < x_2 < m+1$ with $F(x_1) \neq F(x_2)$.
Now, since a distribution function has at most countably many points of discontinuity, we can choose $x_1$ and $x_2$ to be continuity points of $F$: moving $x_1$ slightly to the left and $x_2$ slightly to the right, while staying inside $(m, m+1)$, does not decrease $F(x_2) - F(x_1)$. Since $F_n(x) \to F(x)$ at every continuity point $x$ of $F$, we get $F_n(x_1) \neq F_n(x_2)$ for all large enough $n$. But this contradicts the fact that $X_n$ is integer-valued, since then $F_n$ must be constant on $(m, m+1)$.
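To spell out the limiting step (with $m < x_1 < x_2 < m+1$ both continuity points of $F$, as above):
$$
F_n(x_2) - F_n(x_1) \longrightarrow F(x_2) - F(x_1) > 0 \quad \text{as } n \to \infty,
$$
whereas for every $n$
$$
F_n(x_2) - F_n(x_1) = P\left(X_n \in (x_1, x_2]\right) = 0,
$$
since $(x_1, x_2] \subset (m, m+1)$ contains no integer; the two displays cannot both hold.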
Am I right?
[This is motivated by Exercise 5.12 of Wasserman's *All of Statistics*, where he assumes that $X$ is integer-valued.]