I know a few schemes to compute power sums (I mean $1^k + 2^k + \dots + n^k$); for simplicity, I assume here that every integer multiplication takes $O(1)$ time. One just uses the fast algorithm for computing $i^k$ in $O(\lg k)$, for an overall time of $O(n \lg k)$. Another, using Bernoulli numbers, can be implemented in $O(k^2)$. The most complicated one runs in $O(n \lg \lg n + n \lg k / \lg n)$ and uses something like the sieve of Eratosthenes. (I don't know whether it's well known; I came up with it myself, so if it isn't, I can explain how it works.)
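To make the sieve-style idea concrete, here is my sketch of how I'd implement it (the function name and details are mine, not from any standard reference): a smallest-prime-factor sieve lets us call fast exponentiation only on primes, and every composite $i = p \cdot m$ reuses $i^k = p^k \cdot m^k$ in $O(1)$ multiplications.

```python
def power_sum_sieve(n, k):
    """Return 1^k + 2^k + ... + n^k exactly (illustrative sketch)."""
    if n < 1:
        return 0
    # Smallest-prime-factor sieve, O(n lg lg n)
    spf = list(range(n + 1))
    for p in range(2, int(n ** 0.5) + 1):
        if spf[p] == p:                     # p is prime
            for m in range(p * p, n + 1, p):
                if spf[m] == m:
                    spf[m] = p
    pw = [0] * (n + 1)
    pw[1] = 1
    for i in range(2, n + 1):
        p = spf[i]
        if p == i:                          # prime: one O(lg k) exponentiation
            pw[i] = pow(i, k)
        else:                               # composite: one multiplication
            pw[i] = pw[p] * pw[i // p]
    return sum(pw[1:])
```

Since there are only about $n / \ln n$ primes up to $n$, the exponentiations cost $O(n \lg k / \lg n)$ in total, matching the stated bound.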
Each of these three algorithms except the first has its own pros and cons. For example, for every $k = n^{O(1)}$ the last algorithm runs in $O(n \lg \lg n)$ time, while the first takes $O(n \lg n)$ and the second is even worse; but when $k = o(\sqrt{n \lg \lg n})$, the second algorithm performs better than the others.
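For completeness, here is one way the $O(k^2)$ Bernoulli-number method can look (a sketch using exact rationals, assuming Faulhaber's formula with the $B_1 = +\tfrac12$ convention; the recurrence dominates with $O(k^2)$ arithmetic operations, and the final sum is independent of $n$ apart from computing powers of $n$):

```python
from fractions import Fraction
from math import comb

def bernoulli_plus(k):
    """B_0..B_k with the B_1 = +1/2 convention, via the standard recurrence."""
    B = [Fraction(0)] * (k + 1)
    B[0] = Fraction(1)
    for m in range(1, k + 1):
        # sum_{j<m} C(m+1, j) B_j = -(m+1) B_m   (minus convention)
        s = sum(comb(m + 1, j) * B[j] for j in range(m))
        B[m] = -s / (m + 1)
    # flip to the +1/2 convention: B+_j = (-1)^j B-_j (only B_1 changes)
    return [b if j % 2 == 0 else -b for j, b in enumerate(B)]

def power_sum_faulhaber(n, k):
    """1^k + ... + n^k via Faulhaber's formula, O(k^2) arithmetic."""
    B = bernoulli_plus(k)
    total = sum(comb(k + 1, j) * B[j] * Fraction(n) ** (k + 1 - j)
                for j in range(k + 1))
    return total / (k + 1)
```

Note the running time does not depend on $n$ at all (beyond the powers of $n$), which is why this wins for small $k$.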
So my question is: do there exist more efficient algorithms? (As in previous problems, I am interested in the asymptotics not merely in $n$, but in $k$ too.)
Thank you very much!
P.S.: I've already asked this question on the cstheory forum, but maybe this is a more appropriate place for it.