It is easy to spot an integer-valued random variable by looking at its characteristic function: the characteristic function is periodic with period $2 \pi$. For example, for the binomial distribution it is $\phi(t) = (1-p+p \, \mathrm{e}^{i t})^n$.
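The periodicity is easy to check numerically; here is a minimal sketch (the function name `binom_cf` is mine, not standard):

```python
import cmath
import math

def binom_cf(t, n, p):
    """Characteristic function of Binomial(n, p): (1 - p + p*e^{it})^n."""
    return (1 - p + p * cmath.exp(1j * t)) ** n

# An integer-valued variable has phi(t + 2*pi) == phi(t) for all t.
n, p = 10, 0.3
for t in (0.5, 1.7, 3.0):
    assert abs(binom_cf(t, n, p) - binom_cf(t + 2 * math.pi, n, p)) < 1e-12
```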
I recently came across Khintchine's result that $\phi(t) = \frac{\zeta(s + i t)}{\zeta(s)}$ is a characteristic function of a random variable for $s > 1$. After some fudging, I determined that it corresponds to the random variable $x_k = -\log(k)$, where $k$ follows the Zipf distribution with parameter $s-1$. Indeed:
$ \mathbb{E}( \mathrm{e}^{ -i t \log(k)} ) = \mathbb{E}( k^{-i t} ) = \sum_{k \ge 1} k^{-i t} \frac{k^{-s}}{\zeta(s)} = \frac{\zeta(s+i t)}{\zeta(s)} $
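This identity can be checked numerically, assuming the `mpmath` library (which evaluates $\zeta$ at complex arguments) is available; the truncation point `N` is an arbitrary choice, with the tail of the sum being $O(1/N)$ for $s = 2$:

```python
import mpmath

s, t = 2.0, 1.3
zs = mpmath.zeta(s)

# Truncated E(k^{-it}) under P(K = k) = k^{-s} / zeta(s).
N = 10000
lhs = sum(mpmath.power(k, -s - 1j * t) for k in range(1, N + 1)) / zs
rhs = mpmath.zeta(s + 1j * t) / zs
assert abs(lhs - rhs) < 1e-3  # difference is just the truncated tail
```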
This characteristic function thus also corresponds to a discrete random variable, even though its support $\{-\log(k)\}$ is not a lattice and $\phi$ is not periodic.
This brings up a question: can one easily spot a discrete random variable from its characteristic function? Or is inverting the characteristic function the only way? How does one go about doing the inversion? An ordinary inverse Fourier transform would produce distributions (a sum of Dirac deltas), right?
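For the integer-valued case at least, the inversion I know of avoids distributions entirely: the point masses are the Fourier coefficients $p_k = \frac{1}{2\pi} \int_{-\pi}^{\pi} \mathrm{e}^{-i t k} \phi(t) \, dt$. A sketch recovering the binomial pmf this way (the helper names and the midpoint-rule resolution `M` are my own choices):

```python
import cmath
import math
from math import comb

def binom_cf(t, n, p):
    """Characteristic function of Binomial(n, p)."""
    return (1 - p + p * cmath.exp(1j * t)) ** n

def pmf_from_cf(cf, k, M=4096):
    """p_k = (1/2pi) * integral_{-pi}^{pi} e^{-itk} cf(t) dt, midpoint rule."""
    h = 2 * math.pi / M
    nodes = (-math.pi + (j + 0.5) * h for j in range(M))
    total = sum(cmath.exp(-1j * t * k) * cf(t) for t in nodes)
    return (total * h / (2 * math.pi)).real

n, p = 10, 0.3
for k in range(n + 1):
    exact = comb(n, k) * p**k * (1 - p) ** (n - k)
    assert abs(pmf_from_cf(lambda t: binom_cf(t, n, p), k) - exact) < 1e-10
```

The midpoint rule converges very fast here because the integrand is smooth and periodic, so the recovered masses match the exact pmf to near machine precision.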
Thank you.