I've edited the answer twice without deleting the previous versions, to keep the history of the answer and the comments transparent. That makes it much longer than it originally was, but if you're just interested in the end result, the first and last paragraphs make up a concise and complete proof.
If $S_k$ converges, then the differences $\Delta S_k := S_{k+1} - S_k$ must converge to zero. Hence the differences $\Delta^2 S_k := \Delta S_{k+1} - \Delta S_k$ must also converge to zero, and so on for $\Delta^n S_k$ for all $n$. But $\Delta^n S_k = \sum_i (c_i - 1)^n c_i^k$, where the sum runs over all the $c_i$. There can be either one or two values of $c_i$ for which the absolute value of $c_i - 1$ is maximal. If there is only one, the corresponding term will dominate the sum for sufficiently large $n$, in the sense that its absolute value becomes greater than the absolute value of the sum of all the other contributions. Since this term only converges to zero if $c_i - 1 = 0$, it follows that $c_i = 1$ for all $i$.
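For reference, here is the short computation behind that identity, writing $S_k = \sum_i c_i^k$ (which is what the formula presupposes): a single difference gives
$$\Delta S_k = \sum_i c_i^{k+1} - \sum_i c_i^k = \sum_i (c_i - 1)\,c_i^k\;,$$
and each further application of $\Delta$ multiplies every term by another factor of $c_i - 1$, so by induction
$$\Delta^n S_k = \sum_i (c_i - 1)^n\,c_i^k\;.$$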
If there are two (conjugate) maximal values of $c_i - 1$, there is a linear combination of $S_k$ and its complex conjugate in which only one of these maximal values occurs; if $S_k$ had a limit, so would its complex conjugate and this linear combination.
Edit in response to kevincuadros's question about the linear combination part:
In trying to clarify this, I can see now why you didn't "fully get" it -- because I hadn't thought it through properly :-)
What I had in mind was this: If there are two different values $c_i$ with maximal absolute value of $c_i - 1$, they are conjugate. Both may occur more than once; let $\mu_1$ and $\mu_2$ be their multiplicities. Then $\mu_1 S_k - \mu_2 S_k^\mathrm{*}$ will contain one of them with multiplicity $\mu_1^2 - \mu_2^2$ and the other with multiplicity $\mu_1\mu_2 - \mu_2\mu_1=0$. I was thinking that we could then reason that since this linear combination contains only one of the two, we can apply the above proof to it, and then argue that if $S_k$ had a limit, then so would $S_k^\mathrm{*}$ and any linear combination of them. But I see now that that doesn't work out, because we could have $\mu_1=\mu_2$, and hence $\mu_1^2-\mu_2^2=0$; in that case neither of the two would occur in that linear combination, so we still have to deal with that case.
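To spell out the bookkeeping in that linear combination (a sketch; $c$ denotes one of the two conjugate values and $R_k$ collects all remaining terms, notation introduced just for this step):
$$S_k = \mu_1 c^k + \mu_2 \bar c^{\,k} + R_k, \qquad S_k^\mathrm{*} = \mu_1 \bar c^{\,k} + \mu_2 c^k + R_k^\mathrm{*},$$
so
$$\mu_1 S_k - \mu_2 S_k^\mathrm{*} = (\mu_1^2 - \mu_2^2)\,c^k + (\mu_1\mu_2 - \mu_2\mu_1)\,\bar c^{\,k} + \mu_1 R_k - \mu_2 R_k^\mathrm{*}\;,$$
and the coefficient of $c^k$ vanishes exactly when $\mu_1 = \mu_2$.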
So in that case, the contributions from these conjugates cancel in the imaginary part and add up in the real part. If $S_k$ is to converge, its real part must converge. But $(c_i - 1)^n c_i^k$ comes arbitrarily close to being real infinitely often; thus its real part comes arbitrarily close to $|c_i - 1|^n$ infinitely often, and hence the argument for the general case carries through.
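In symbols, with $c$ as above and $\mu := \mu_1 = \mu_2$ in that case, the two contributions to $\Delta^n S_k$ combine as
$$\mu\,(c - 1)^n c^k + \mu\,(\bar c - 1)^n \bar c^{\,k} = 2\mu\,\operatorname{Re}\!\left[(c - 1)^n c^k\right],$$
since the second term is the complex conjugate of the first; the imaginary parts cancel and the real parts add.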
To make the full proof more concise, we can forget about the whole linear combination thing and just argue as follows: If there are two distinct values of $c_i$ with equal maximal absolute value of $c_i - 1$, they are conjugate. Then consider the sum of the real parts of their contributions, which comes arbitrarily close to the sum of their absolute values infinitely often, and hence the argument for the general case carries through.
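Written out (with $c$, $\bar c$ the conjugate pair, $\mu_1, \mu_2$ their multiplicities, and using $|c| = 1$, which is what makes the modulus of $c^k$ drop out), the sum of the real parts of their contributions to $\Delta^n S_k$ is
$$\operatorname{Re}\!\left[\mu_1 (c - 1)^n c^k + \mu_2 (\bar c - 1)^n \bar c^{\,k}\right] = (\mu_1 + \mu_2)\,|c - 1|^n \cos\bigl(n\arg(c - 1) + k\arg c\bigr),$$
so the claim above is that this cosine comes arbitrarily close to $1$ infinitely often.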
Further edit to salvage the style of the proof:
I don't really like the above fix, since part of the point of the original proof was to avoid saying something like "x gets arbitrarily close to y an infinite number of times". So here's a slightly nicer fix:
If there are two distinct values of $c_i$ with equal maximal absolute value of $c_i - 1$, they are conjugate. Then the real parts of their contributions add up to $|c_i-1|^n$ times the cosine of an angle that changes with $k$ (times a positive constant that plays no role here). Since the ratio of $|c_i-1|^n$ to the absolute value of the sum of all other terms goes to infinity with increasing $n$, increasing $n$ will yield arbitrarily small upper bounds on the cosine. But since $c_i \neq \pm 1$, the cosine necessarily violates these bounds. (Note that this doesn't use the property that the cosine gets arbitrarily close to 1, only that it doesn't remain arbitrarily close to 0, which is much less and doesn't require an argument using rational and irrational numbers.)
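One way to make that last step explicit (a sketch, writing $\theta_{n,k} := n\arg(c_i - 1) + k\arg c_i$ for the angle in question): the bounds say that for each $n$ there is a $K_n$ such that $|\cos\theta_{n,k}| \le \varepsilon_n$ for all $k \ge K_n$, with $\varepsilon_n \to 0$ as $n \to \infty$. But $\theta_{n,k+1} - \theta_{n,k} = \arg c_i$, which is not a multiple of $\pi$ because $c_i \neq \pm 1$. If $\cos\theta_{n,k}$ and $\cos\theta_{n,k+1}$ were both at most $\varepsilon_n$ in absolute value, both angles would lie within $\arcsin\varepsilon_n$ of $\tfrac{\pi}{2}$ modulo $\pi$, forcing $\arg c_i$ to lie within $2\arcsin\varepsilon_n$ of a multiple of $\pi$; that fails once $\varepsilon_n$ is small enough, which is exactly the sense in which the cosine cannot remain arbitrarily close to $0$.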