
If we have a function $f(s)$ with this form:

$ f(s) = \sum_{i=0}^{\infty} p_i s^i $

We also know that:

$ f(1) = 1 $

and

$ p_i \ge 0 \quad \text {for all $ i \ge 0$} $

Assuming we can evaluate $f(s)$ at any $s$, is it possible, from this information alone, to recover $p_n$ for any $n$?

(In fact, $p_i$ is the probability $P[Z=i]$, where $Z$ is a random variable.)
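
As a concrete (hypothetical) instance of the setup: if $Z$ is Bernoulli with $P[Z=1]=p$, then

$ f(s) = (1-p) + p\,s $

so $f(1) = 1$ and every coefficient is nonnegative; reading $p_0$ and $p_1$ back off from evaluations of $f$ is exactly the recovery problem being asked about.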

  • Then either the Richardson or Lanczos methods in the answer I linked to might be of service. (2011-10-02)

1 Answer


This is the discrete version of the moment problem, or an infinite version of a Vandermonde system. One approach is that $p_0=f(0)$, $p_1=\left.\dfrac{\mathrm df(s)}{\mathrm ds}\right|_{s=0}$, and in general $p_n=\dfrac{1}{n!}\left.\dfrac{\mathrm d^nf(s)}{\mathrm ds^n}\right|_{s=0}$. Of course, this is rather unstable numerically.
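
A minimal sketch of this derivative approach, assuming a hypothetical Poisson generating function for `f` (chosen only because its exact coefficients are known for checking; the helper name `coeff_via_derivative` and the step size `h` are my own choices):

```python
import math

# Hypothetical test case: Z ~ Poisson(lam), whose generating function is
# f(s) = exp(lam*(s - 1)), so the exact coefficients p_i = exp(-lam)*lam**i/i!
# are available to check the recovery.
lam = 2.0

def f(s):
    return math.exp(lam * (s - 1.0))

def coeff_via_derivative(f, n, h=1e-2):
    """Estimate p_n = f^(n)(0)/n! using an n-th forward difference:
    f^(n)(0) ~ sum_k (-1)**(n-k) * C(n, k) * f(k*h) / h**n."""
    d = sum((-1) ** (n - k) * math.comb(n, k) * f(k * h) for k in range(n + 1))
    return d / (h ** n * math.factorial(n))

p2_est = coeff_via_derivative(f, 2)
p2_exact = math.exp(-lam) * lam ** 2 / 2
```

Even at $n=2$ the estimate carries an $O(h)$ bias, and for larger $n$ the alternating binomial sum cancels catastrophically in floating point, which is the instability mentioned above.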

  • @ablmf: you can do [Richardson extrapolation](http://math.stackexchange.com/questions/65569/65619#65619) in conjunction with Ross's "take $s$ smaller and smaller" stratagem. It doesn't completely cure the numerical instability, but you might manage to squeeze out a few more digits of accuracy as long as you don't shrink $s$ too much. (2011-10-02)
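
A self-contained sketch of one Richardson step applied to the finite-difference coefficient estimate (again using a hypothetical Poisson generating function so the exact value is known; all names here are mine, not from the linked answer):

```python
import math

# Hypothetical Poisson(2) generating function, used only so the exact
# coefficient p_2 = exp(-2) * 2**2 / 2! is known for comparison.
lam = 2.0

def f(s):
    return math.exp(lam * (s - 1.0))

def p_n(h, n=2):
    # Forward-difference estimate of p_n = f^(n)(0)/n!; its error is O(h).
    d = sum((-1) ** (n - k) * math.comb(n, k) * f(k * h) for k in range(n + 1))
    return d / (h ** n * math.factorial(n))

def richardson(h, n=2):
    # One Richardson step: the O(h) error terms of p_n(h) and p_n(h/2)
    # cancel in 2*p_n(h/2) - p_n(h), leaving an O(h^2) error.
    return 2.0 * p_n(h / 2, n) - p_n(h, n)

exact = math.exp(-lam) * lam ** 2 / 2
```

The extrapolated value is noticeably closer to the exact coefficient than either raw difference quotient, but shrinking $h$ too far still loses digits to cancellation, consistent with the caveat in the comment.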