The Taylor series represents a non-polynomial function as an infinite series of polynomials, so is it possible to express a polynomial function as an infinite series of non-polynomial functions?
Can a polynomial be expressed as an infinite sum of non-polynomial functions?
-
0Random relevant comment: Taylor series are nice because you can write an arbitrary smooth function (which can be complicated) as a sum of polynomials (which are relatively simple). – 2017-02-10
-
1Depends on your understanding of *represents*. Pointwise it is fairly obvious – 2017-02-10
-
0@mathematician Do you count $e^{-1/x^2}$ as "arbitrary smooth"? – 2017-02-10
-
1@Antitheos I define smooth to be whatever makes me correct ;) So yes I should have said analytic. – 2017-02-10
6 Answers
Without constraints on the functions, the answer is trivial. Take any family of non-polynomial functions $\phi_n(x),n>0$ such that their sum converges to some non-polynomial $\sigma(x)$.
Then define
$$\phi_0(x):=P(x)-\sigma(x)$$ and you have it:
$$\sum_{n=0}^\infty\phi_n(x)=P(x).$$
The set of non-polynomial functions is much richer than that of polynomials, so there is no symmetry between Taylor and "reverse Taylor".
Another very simple example: the family of functions that equal the desired polynomial on the interval $[n,n+1)$ and vanish elsewhere.
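As a concrete (hypothetical) instance of the construction above, one can take $\phi_n(x)=e^{-x}/2^n$ for $n\geq 1$, so that $\sigma(x)=e^{-x}$, and set $\phi_0(x)=P(x)-e^{-x}$; every term is non-polynomial and the partial sums converge to $P$. A quick numerical sketch:

```python
import math

# Hypothetical concrete choice for the construction above:
# phi_n(x) = e^{-x} / 2^n for n >= 1, so sigma(x) = e^{-x},
# and phi_0(x) = P(x) - e^{-x}.  No phi_n is a polynomial.
def P(x):
    return x**3 - 2*x + 1

def phi(n, x):
    if n == 0:
        return P(x) - math.exp(-x)
    return math.exp(-x) / 2**n

x = 1.7
partial = sum(phi(n, x) for n in range(50))  # tail is e^{-x} * 2^{-49}
assert abs(partial - P(x)) < 1e-9
```

The choice of $e^{-x}$ and the weights $2^{-n}$ is arbitrary; any summable family of non-polynomials works the same way.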
Here is an interesting example I found when I messed around with this same topic a while back. This sum gives you any polynomial you want: $$\frac{(-1)^n}{n!}\sum _{k=0}^n{n \choose k}\left(-1\right)^{k}f(x)\left(\sin \left(x\right)+k\right)^n =f(x)$$
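The identity can be checked numerically: the binomial sum collapses because $\sum_{k=0}^n \binom{n}{k}(-1)^k (y+k)^n = (-1)^n\, n!$ for every $y$ (the $n$-th finite difference of $y^n$, up to sign). A minimal sketch, with an arbitrary choice of $f$ and $n$:

```python
import math

# Check: ((-1)^n / n!) * sum_k C(n,k) (-1)^k f(x) (sin(x)+k)^n == f(x),
# since sum_k C(n,k) (-1)^k (y+k)^n = (-1)^n * n! for every y.
def f(x):
    return 3*x**2 - x + 5  # any function works; the inner sum collapses

x, n = 0.9, 6
s = sum(math.comb(n, k) * (-1)**k * f(x) * (math.sin(x) + k)**n
        for k in range(n + 1))
s *= (-1)**n / math.factorial(n)
assert abs(s - f(x)) < 1e-8
```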
-
1But in order to evaluate the left-hand side, you need to evaluate $f(x)$, which rather defeats the point of *expressing a polynomial without polynomials*, because you stick it right back in there... – 2017-02-10
-
0@Laray That is actually how Mark's example works too. Strictly speaking, a polynomial multiplied by a non-polynomial is a non-polynomial, so this *does* satisfy the criteria – 2017-02-10
-
0I was (as I read the question) thinking about evaluating the polynomial (and derivatives/integrals/whatever you deem necessary) at a single point to get an approximation, just like with a Taylor/Laurent series... – 2017-02-10
-
0@Laray True, this will not do that. However, that is not what the OP asked for. Some of these answers will do that to varying degrees anyway, but I don't have to make mine be a local approximation by any means. – 2017-02-10
-
0Of course, this is because $\sum_{k=0}^n \binom{n}{k} (-1)^k k^r = 0 $ for $r \in \{0,1,\cdots, n-1\}$, and $(-1)^n\, n!$ for $r=n$, which was known to Euler. – 2017-02-10
-
0@Chappers huh. Cool to know. I have only recently been able to prove it... I just conjectured it a few years back. But shhhh... don't ruin the magic XD – 2017-02-10
-
0Turns out I've actually written [an answer](http://math.stackexchange.com/questions/1206191/) about it here before. There's also another proof that was a small part of a paper I wrote, where you look at $n\int_0^{\infty} x^{\alpha} (1-e^{-x})^{n-1} e^{-x} \, dx$ in two different ways. – 2017-02-10
-
0@Chappers could I have a link to the paper? I think the shortest proof I've seen is a combination of a combinatorial argument and then shifting the sum to show no dependence on $x$ (which was a collaborative proof to a question I asked on this topic a couple years ago) – 2017-02-10
-
0[You may](https://arxiv.org/abs/1601.04966), although treatment of the result is limited to a couple of remarks, notably after its statement in Theorem 1 and a footnote in the proof of Lemma 1. I have provided citations of Euler's original papers, at least. – 2017-02-10
Sure it's possible! Take a simple Fourier series. One of my favorites:
$$x=\pi-2\left(\frac{\sin(x)}1+\frac{\sin(2x)}2+\frac{\sin(3x)}3+\dots\right)$$
For $x\in(0,2\pi)$.
$$y=2\left(\frac{\sin(y)}1-\frac{\sin(2y)}2+\frac{\sin(3y)}3-\dots\right)$$
For $y\in(-\pi,\pi)$.
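Convergence of the first series is easy to check numerically, though it is slow (the pointwise error away from the endpoints decays like $1/N$). A sketch with an arbitrary interior point:

```python
import math

# Partial sums of pi - 2*sum(sin(n x)/n) approach x on (0, 2*pi).
# Convergence is O(1/N) pointwise away from the endpoints 0 and 2*pi.
def fourier_x(x, terms):
    return math.pi - 2 * sum(math.sin(n * x) / n
                             for n in range(1, terms + 1))

x = 2.0
approx = fourier_x(x, 100000)
assert abs(approx - x) < 1e-3
```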
-
1your Fourier series on the RHS is not a polynomial even though it is equal to $x$ on the interval $(0, 2\pi)$ – 2017-02-10
-
0@hyportnex Yes...? – 2017-02-10
-
1the question was how to represent a polynomial not a *part* of a polynomial – 2017-02-10
-
0Suppose you are looking at $x^2+2x-1$. This is then trivial, since$$x^2+2x-1=\left(\pi-2\left(\frac{\sin(x)}1+\frac{\sin(2x)}2+\frac{\sin(3x)}3+\dots\right)\right)^2+2\left(\pi-2\left(\frac{\sin(x)}1+\frac{\sin(2x)}2+\frac{\sin(3x)}3+\dots\right)\right)-1$$ – 2017-02-10
-
2but the RHS is periodic and the LHS is not over the whole axis, so the two sides can be equal only on a restricted interval; the RHS is not equal to a polynomial over the whole axis, since polynomials (except constants) are never periodic functions. – 2017-02-10
-
0@hyportnex And so? I think my answer is fine, regardless of the fact that I had a domain restriction. – 2017-02-10
-
0Even with a domain restriction, just add a piecewise function equal to the rest of the polynomial outside the domain on which the non-polynomial series is defined. Since it's piecewise, it's not really a polynomial, and the sum doesn't become the complete polynomial until the infinite series converges, so this should meet the requirements. – 2017-02-10
-
2@Dason One may also point out that most series expansions have a limited radius of convergence to the original function, so nothing is unreasonable here. – 2017-02-10
$\frac{x^2}{(1-x)^k}$ is not a polynomial for any $k>0$.
Yet $$ \sum_{k=1}^\infty \frac{x^2}{(1-x)^k} = -x $$ is a polynomial.
If you want a series which converges on the entire real axis, try $$ \sum_{k=1}^\infty x^2 \left(e^{x^2}-1\right)e^{-kx^2} = x^2 $$
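Both sums are geometric in $k$ and can be verified by truncation; the first converges only where $|1-x|>1$ (i.e. $x<0$ or $x>2$), while the second converges everywhere since each term vanishes at $x=0$. A quick check at sample points:

```python
import math

# First sum: geometric with ratio 1/(1-x); needs |1 - x| > 1.
x = 3.0
s1 = sum(x**2 / (1 - x)**k for k in range(1, 200))
assert abs(s1 - (-x)) < 1e-9

# Second sum: geometric with ratio e^{-x^2} < 1 for x != 0,
# and every term is 0 at x = 0, so it converges on all of R.
x = 0.8
s2 = sum(x**2 * (math.exp(x**2) - 1) * math.exp(-k * x**2)
         for k in range(1, 200))
assert abs(s2 - x**2) < 1e-9
```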
Here's an easy example. The hard part is actually just making sure none of the infinite number of functions in the sum is a polynomial, assuming you consider the zero function to be a polynomial.
Let $p(x)$ be an arbitrary polynomial. Then let \begin{align} f_0(x) &= \sin x \\ f_1(x) &= \begin{cases} p(x) & x < 0 \\ -\sin x & x \geq 0 \end{cases} \\ f_2(x) &= \begin{cases} -\sin x & x < 0 \\ p(x) & x \geq 0 \end{cases} \\ f_n(x) &= \begin{cases} \dfrac{\sin x}{2^{n+1}} & \text{$n>2$ and $n$ odd} \\ -\dfrac{\sin x}{2^n} & \text{$n>2$ and $n$ even} \end{cases} \\ \end{align}
None of these functions is a polynomial, since each has infinitely many zeros yet is not identically zero. But $p(x) = f_0(x) + f_1(x) + f_2(x).$
The functions $f_n(x)$ for $n > 2$ are defined in order to satisfy the requirement to express $p(x)$ as an infinite number of non-polynomial functions (which I think is the most difficult part of this problem). Each successive pair of functions has sum zero, so $f_3(x) + \cdots + f_{2k}(x) = 0$ for any integer $k \geq 2$. For $n>2,$ therefore, the partial sum $f_0(x) + \cdots + f_n(x)$ is $p(x)$ if $n$ is even and is $p(x) + \frac{\sin x}{2^{n+1}}$ if $n$ is odd.
The functions $\frac{\sin x}{2^{n+1}}$ converge to zero as $n\to\infty,$ so the sum converges to $p(x).$
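The construction (with the even-indexed terms for $n>2$ negated so that consecutive pairs cancel) can be sketched numerically; the choice of $p$ below is arbitrary:

```python
import math

# Sketch of the piecewise construction: f_0 + f_1 + f_2 = p(x),
# and for n > 2 consecutive pairs cancel exactly.
def p(x):
    return x**2 + 1  # arbitrary sample polynomial

def f(n, x):
    if n == 0:
        return math.sin(x)
    if n == 1:
        return p(x) if x < 0 else -math.sin(x)
    if n == 2:
        return -math.sin(x) if x < 0 else p(x)
    # n > 2: odd terms are +sin(x)/2^(n+1), even terms -sin(x)/2^n
    return math.sin(x) / 2**(n + 1) if n % 2 else -math.sin(x) / 2**n

x = 1.3
total = sum(f(n, x) for n in range(101))  # ends on an even index
assert abs(total - p(x)) < 1e-12
```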
Consider the sum of the sequence of characteristic functions of the intervals $[n, n+1)$ over all integers $n$. None of these functions is a polynomial, yet their sum is the constant polynomial $1$.
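At any real $x$, exactly one characteristic function in the family is $1$ and all others are $0$, so every truncation wide enough to contain $x$ already sums to $1$. A minimal sketch:

```python
# chi_n is the indicator of [n, n+1); at any x exactly one term is 1.
def chi(n, x):
    return 1 if n <= x < n + 1 else 0

x = 2.71
assert sum(chi(n, x) for n in range(-50, 50)) == 1
```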
-
1Already said, more generally. – 2017-02-10