5

Let $p$ be a polynomial of known degree $n$:

$$p(x) = a_0 + a_1 x + \ldots + a_n x^n$$

Suppose we have a magic black box that can evaluate the polynomial for us. How could one then determine the coefficients $a_i$? Lagrange polynomials provide a solution, but perhaps we can do better given that we have freedom to choose where to evaluate $p$?

Of course, I'm not really sure what I mean by 'better': a minimal number of operations, perhaps? There may be optimal solutions for a given degree, but the general case seems tricky. I'll start you off with my solution for the linear case:

$$\begin{align*}a_0 &= p(0) \\ a_1 &= p(1) - a_0\end{align*}$$

And for a quadratic:

$$\begin{align*} a_0 &= p(0) \\ a_2 &= \frac{1}{2}\left[p(-1) + p(1) - 2a_0\right] \\ a_1 &= p(1) - a_2 - a_0 \end{align*}$$
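For concreteness, here is a quick numerical check of these two recipes, with hard-coded polynomials standing in for the black box (the function names below are only illustrative):

```python
# Hard-coded stand-ins for the black box; the names are only illustrative.

def p_lin(x):       # p(x) = 3 + 2x
    return 3 + 2 * x

def p_quad(x):      # p(x) = 1 - 4x + 5x^2
    return 1 - 4 * x + 5 * x ** 2

# Linear case: a0 = p(0), a1 = p(1) - a0
a0 = p_lin(0)
a1 = p_lin(1) - a0
print(a0, a1)       # 3 2

# Quadratic case: a0 = p(0), a2 = (p(-1) + p(1) - 2*a0) / 2, a1 = p(1) - a2 - a0
b0 = p_quad(0)
b2 = (p_quad(-1) + p_quad(1) - 2 * b0) / 2
b1 = p_quad(1) - b2 - b0
print(b0, b1, b2)   # 1 -4.0 5.0
```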

  • 4
    If you can evaluate your polynomial at the roots of unity, then you can use FFT to get your coefficients... (2012-04-20)
  • 0
    To expand on JM's comment, an answer I [recently gave](http://math.stackexchange.com/questions/134075/primitive-roots-of-unity/134086#134086) (which wasn't the first time) tells us that $$a_\mu = \frac{1}{n}\sum_{j=0}^{n-1} \omega^{-j\mu}f(\omega^j).$$ (2012-04-20)
  • 0
    @J.M.'s remark mentions a special case of a Vandermonde matrix that is easily (and in a numerically stable way) inverted. (2012-04-20)
  • 0
    Alternatively, you can use any number of [numerical differentiation methods](http://mathoverflow.net/questions/64302/64306#64306) to recover coefficients, but this route is not as numerically stable as the FFT route. (2012-04-20)
  • 0
    Or you could use a least-squares approach with more than $n$ points, since you know $n$. This would have better numerical properties, I imagine. (2012-04-20)
  • 0
    @copper.hat's suggestion is a good one. The kink is that least squares takes a lot more computational effort than Fourier transformation... (2012-04-20)
  • 0
    If the polynomial has positive integer coefficients, see https://math.stackexchange.com/questions/446130 (2018-11-28)

2 Answers

10

To expand my comment into an answer:

  1. If you are able to evaluate your black-box polynomial at complex values, then there is an $O(n\log\,n)$ method for generating your coefficients from those values: the fast Fourier transform. As noted by copper.hat in the comments, the matrix underlying the discrete Fourier transform (the Fourier matrix) is a special case of the Vandermonde matrix, the matrix that is involved in the solution of polynomial interpolation problems. More precisely: given your black-box $p(x)$, generate the $n+1$ values $X_k=p\left(\exp\left(-\frac{2\pi ik}{n+1}\right)\right),\quad k=0,\dots,n$ and then perform the inverse transform $c_j=\frac1{n+1}\sum\limits_{k=0}^n X_k \exp\left(\frac{2\pi i jk}{n+1}\right),\quad j=0,\dots,n$. The $c_j$ are your desired polynomial coefficients.

  2. If you can evaluate only at real values, things are slightly more difficult. One of the more usual things to do is to evaluate your polynomial at the $n+1$ Chebyshev nodes $x_k=\cos\left(\pi\frac{k-\frac12}{n+1}\right),\quad k=1,\dots,n+1$ and then use the Björck-Pereyra algorithm to solve the underlying Vandermonde linear system. (See the paper I linked to for implementation details.) This is better than the usual choice of equidistant nodes, since the underlying Vandermonde linear system has better conditioning (and thus yields more accurate solutions) when the sample points are clustered appropriately, which is the case for the Chebyshev nodes. Both methods are sketched numerically right after this list.
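
One possible NumPy sketch of the two methods above; the polynomial `p` is just a stand-in for the black box, and a plain `numpy.linalg.solve` on the Vandermonde matrix stands in for the Björck-Pereyra solver, which NumPy does not provide:

```python
import numpy as np

# Stand-in for the black box: p(x) = 2 + 3x - x^2 + 4x^3, i.e. a = [2, 3, -1, 4].
a_true = np.array([2.0, 3.0, -1.0, 4.0])
n = len(a_true) - 1

def p(x):
    # Works for scalars and NumPy arrays alike.
    return sum(a * x ** j for j, a in enumerate(a_true))

# --- Method 1: sample at the (n+1)-th roots of unity, then inverse DFT ---
k = np.arange(n + 1)
X = p(np.exp(-2j * np.pi * k / (n + 1)))       # X_k = p(exp(-2*pi*i*k/(n+1)))
c = np.fft.ifft(X)                             # c_j = (1/(n+1)) sum_k X_k exp(+2*pi*i*j*k/(n+1))
print(c.real)                                  # ~ [ 2.  3. -1.  4.]  (imaginary parts ~ 0)

# --- Method 2: sample at n+1 Chebyshev nodes, solve the Vandermonde system ---
kk = np.arange(1, n + 2)
nodes = np.cos(np.pi * (kk - 0.5) / (n + 1))   # n+1 distinct nodes in (-1, 1)
V = np.vander(nodes, n + 1, increasing=True)   # V[i, j] = nodes[i] ** j
print(np.linalg.solve(V, p(nodes)))            # ~ [ 2.  3. -1.  4.]
```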

  • 0
    Since the OP says that $n$ is known, I don't think it is necessary to have a long stretch of zeroes anywhere. You may be thinking of polynomial multiplication (or convolution of coefficients using the fast Fourier transform), which does need long strings of zeroes (zero-padding) because the $n$-point DFT will give polynomial multiplication modulo $(x^n-1)$. (2012-04-20)
  • 0
    @Dilip: apparently I wasn't paying sufficient attention. I shall fix; thanks! (2012-04-23)
  • 0
    Method 1 works only if you don't have a constant term in your polynomial, but that's easy enough to filter out. Are there any conditions on method 2 I should be aware of before I try it out? (2012-04-23)
  • 0
    @wxffles: actually, it does work; it's a Vandermonde system that you're solving, after all. What were you working on that prompted this observation of yours? (2012-04-23)
  • 0
    @J.M. I just tried a few known test polynomials and threw them at the formulae for $X_k$ and $c_j$. It only worked when $a_0 = 0$. On further inspection it looks like $c_0 = c_n = a_0 + a_n$. (2012-04-23)
  • 0
    @wxffles: did you remember to use the inverse transform? What computing environment are you using? (2012-04-24)
  • 0
    Also, it seems I made a serious mistake. One needs $n+1$ points to properly determine an $n$-th degree polynomial. Sorry about my error, @wxffles. Could you check your computations now? (2012-04-24)
  • 0
    This seems to work: $X_k=p\left(\exp\left(-\frac{2\pi ik}{n+1}\right)\right), k=0,\dots,n$ and $c_j=\frac1{n+1}\sum\limits_{k=0}^{n} X_k \exp\left(\frac{2\pi i jk}{n+1}\right), j=0,\dots,n$ (2012-04-24)
2

Just evaluate at the integers from $0$ to $n$: that gives exactly the right amount of information. From those values, compute the successive differences $\Delta_0(m)=p(m)$ and, for $i\ge0$, $\Delta_{i+1}(m)=\Delta_i(m+1)-\Delta_i(m)$; of course $\Delta_i$ is defined only for the integers from $0$ to $n-i$. Then $p(x)=\sum_{i=0}^n\Delta_i(0)C_i(x)$, where $C_i(x)$ is the $i$-th binomial polynomial: $C_0=1$ and, for $i\ge1$, $C_i(x)=C_{i-1}(x)\cdot(x-i+1)/i$. To get the original coefficients $a_m$ you of course need to expand the binomial polynomials, but if you're doing many of these problems, that last operation needs to be done only once and the results stored.
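
A small Python sketch of this approach, where `p` is again only a stand-in for the black box; here the binomial polynomials are expanded on the fly with NumPy rather than precomputed and stored:

```python
import numpy as np

# Stand-in black box: p(x) = 2 + 3x - x^2 + 4x^3, i.e. a = [2, 3, -1, 4].
a_true = [2.0, 3.0, -1.0, 4.0]
n = len(a_true) - 1

def p(x):
    return sum(a * x ** j for j, a in enumerate(a_true))

# Forward differences: Delta_0(m) = p(m), Delta_{i+1}(m) = Delta_i(m+1) - Delta_i(m).
row = [p(m) for m in range(n + 1)]
leading = []                          # collects Delta_i(0) for i = 0..n
for _ in range(n + 1):
    leading.append(row[0])
    row = [row[j + 1] - row[j] for j in range(len(row) - 1)]

# p(x) = sum_i Delta_i(0) * C_i(x), with C_0 = 1 and C_i(x) = C_{i-1}(x)*(x - i + 1)/i.
# Expand each C_i into monomial coefficients (lowest degree first) and accumulate.
coeffs = np.zeros(n + 1)
C = np.array([1.0])                   # coefficients of C_0(x) = 1
for i in range(n + 1):
    coeffs[: len(C)] += leading[i] * C
    # C_{i+1}(x) = C_i(x) * (x - i) / (i + 1); np.polymul expects highest degree first.
    C = np.polymul(C[::-1], [1.0, -float(i)])[::-1] / (i + 1)

print(coeffs)                         # ~ [ 2.  3. -1.  4.]
```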