I've been wondering about the following question: Suppose that $P$ is a polynomial of degree $n$ with complex coefficients. Assume that $a_0, a_1, \dots, a_n \in \mathbb{C}$ are distinct. Are the polynomials $P(x + a_0), P(x + a_1), \dots, P(x + a_n)$ linearly independent?
Are translations of a polynomial linearly independent?
-
@Yuval: You're right. For the general case directly calculating the determinant doesn't seem very nice, though. – 2010-11-30
4 Answers
Consider the matrix of coefficients, where each row corresponds to one of the points (it lists the coefficients of $P(x+a_i)$) and each column to a power of $x$. Any non-trivial linear dependency among the columns amounts to a non-zero polynomial of degree at most $n$ in the point (see the edit below), which would then have $n+1$ distinct roots. Since this can't happen, the polynomials are independent.
Example (almost Vandermonde) - $P(x) = x^2$: $\begin{pmatrix} 1 & 2a & a^2 \\ 1 & 2b & b^2 \\ 1 & 2c & c^2 \end{pmatrix}$ A linear dependency of the columns would be a quadratic polynomial with three different roots $a,b,c$, which is impossible.
EDIT: Just to clarify what happens in the general case, here's the matrix corresponding to $P(x) = x^2 + \alpha x + \beta$: $\begin{pmatrix} 1 & 2a + \alpha & a^2 + \alpha a + \beta \\ 1 & 2b + \alpha & b^2 + \alpha b + \beta \\ 1 & 2c + \alpha & c^2 + \alpha c + \beta \end{pmatrix}$ Since the degree of the polynomial in column $t$ is $t$ (if we start counting from zero), it's easy to see that any non-zero combination of columns results in a non-zero polynomial (consider the rightmost non-zero column).
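Here is a quick sympy sketch of this computation, with a generic quadratic and three symbolic points (all names chosen just for illustration); the determinant factors into a Vandermonde-type product:

```python
import sympy as sp

x, a, b, c, alpha, beta = sp.symbols('x a b c alpha beta')
P = x**2 + alpha*x + beta

# Row for point t: coefficients of P(x + t) in descending powers of x,
# matching the matrix written above.
def coeff_row(t):
    return sp.Poly(sp.expand(P.subs(x, x + t)), x).all_coeffs()

M = sp.Matrix([coeff_row(t) for t in (a, b, c)])
print(sp.factor(M.det()))  # -> a non-zero constant times (a - b)*(a - c)*(b - c)
```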
-
The tricky part is actually to ensure that the columns cannot have a non-trivial linear combination that equals zero. I tried to consider the polynomial $P(x+y)$ and evaluate it for $x=0,1,\dots,n$ to get the matrix, but then the argument fails. – 2011-02-15
Here is a fancy proof that uses operators and differentiation.
Let $T_a$ be the translation operator which maps polynomials to their translates
$ T_a p(x) = p(x + a) .$
Restricting attention to polynomials of degree $n$ or lower, this is a map from an $(n+1)$-dimensional vector space to itself. The question to be resolved is whether the $n+1$ translation operators $T_{a_0}, T_{a_1}, \dots, T_{a_n}$ are linearly independent (when applied to polynomials of degree $n$).
It is well known that the translation operator can be written as the exponential of the differentiation operator $D$, defined by $Dp(x) := p'(x)$, as follows:
$ T_a = e^{aD} = 1 + \frac1{1!} aD + \frac1{2!} a^2D^2 + \dots + \frac1{n!} a^n D^n .$
This is Taylor's theorem. Note that since $D$ is nilpotent on the space of polynomials of degree at most $n$, with $D^{n+1} = 0$, the series is actually a finite sum.
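As a quick sanity check, here is a small sympy sketch (with an arbitrary cubic, chosen only for illustration) confirming that the truncated exponential series reproduces the translate:

```python
import sympy as sp

x, a = sp.symbols('x a')
n = 3
p = x**3 - 2*x + 5  # an arbitrary degree-3 polynomial

# First n+1 terms of e^{aD} applied to p; D^{n+1} p = 0, so this is the whole sum.
series = sum(a**k / sp.factorial(k) * sp.diff(p, x, k) for k in range(n + 1))

print(sp.expand(series - p.subs(x, x + a)))  # -> 0, the series equals T_a p
```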
Now, the point is that for any polynomial $p(x)$ of degree $n$, the derivatives $p(x), Dp(x), D^2p(x), \dots, D^n p(x)$ are linearly independent, because each differentiation decreases the degree by exactly 1, so they have the distinct degrees $n, n-1, \dots, 0$. Hence, they form a basis of our vector space of polynomials. The expansion of the $T_a$ in terms of this new basis is given by the expression above.
To show that the translates $T_{a_k} p(x)$ are linearly independent, we only have to check that their matrix with respect to the new basis is nonsingular. It reads
$\begin{pmatrix} 1 & 1 & \dots & 1 \\ a_0 & a_1 & \dots & a_n \\ \frac1{2!} a_0^2 & \frac1{2!} a_1^2 & \dots & \frac1{2!} a_n^2 \\ \vdots \\ \frac1{n!} a_0^n & \frac1{n!} a_1^n & \dots & \frac1{n!} a_n^n \\ \end{pmatrix}$
But its determinant is clearly a non-zero multiple of the Vandermonde determinant, which is non-zero if and only if the values $a_0,a_1,\dots,a_n$ are pairwise distinct.
(Another way to see that this matrix is non-singular is to note that the Vandermonde determinant is related to the interpolation problem, whose solution is unique.)
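For the curious, here is a short sympy sketch of this determinant computation for $n = 2$ with symbolic points (the symbol names are illustrative):

```python
import sympy as sp

n = 2
a = sp.symbols('a0:%d' % (n + 1))  # (a0, a1, a2)

# Column k holds the coordinates (1, a_k, a_k^2/2!, ..., a_k^n/n!) of T_{a_k}
# with respect to the basis p, Dp, ..., D^n p.
M = sp.Matrix(n + 1, n + 1, lambda i, k: a[k]**i / sp.factorial(i))

vandermonde = sp.prod(a[j] - a[i]
                      for i in range(n + 1) for j in range(i + 1, n + 1))
print(sp.cancel(M.det() / vandermonde))  # -> 1/2, i.e. 1/(0! * 1! * 2!)
```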
If you write $P(x) = \sum_{j=0}^n p_j x^j$ and set $ \sum_{i=0}^n c_i P(x+a_i) = 0$, the full expression reads $\sum_{i=0}^n \sum_{k=0}^n \sum_{j=0}^k c_i p_k {k \choose j} a_i^{k-j} x^j$. Writing the coefficient of each power $x^j$ as a matrix applied to the vector $(c_0, \dots, c_n)$ and setting the resulting system equal to zero, we see that a non-trivial dependency can exist only if the determinant of the matrix is zero. The matrix is $A_{ij} = \sum_{k=j}^n p_k {k \choose j} a_i^{k-j}$.
Setting the determinant equal to zero gives a polynomial equation in the $a_i$. I don't know if this polynomial is simpler than the matrix hints at, but it is certainly possible to find coefficients $a_i$ such that the polynomials are linearly independent; indeed, for "most" values they will be, since the solution set of the determinant equation has dimension at least one less than that of $\mathbb{C}^{n+1}$ (seen by fixing $a_1, \dots, a_n$ arbitrarily, which leaves only a finite set of possible $a_0$ for which the determinant vanishes).
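Here is a sympy sketch that assembles this matrix for $n = 2$ with symbolic coefficients and points (names chosen for illustration) and factors its determinant:

```python
import sympy as sp

n = 2
p = sp.symbols('p0:%d' % (n + 1))  # coefficients p0, p1, p2
a = sp.symbols('a0:%d' % (n + 1))  # points a0, a1, a2

# A_{ij} = sum_{k=j}^n p_k * C(k, j) * a_i^(k-j), as in the answer above.
A = sp.Matrix(n + 1, n + 1,
              lambda i, j: sum(p[k] * sp.binomial(k, j) * a[i]**(k - j)
                               for k in range(j, n + 1)))
print(sp.factor(A.det()))
# -> a constant times p2**3 times (a0 - a1)*(a0 - a2)*(a1 - a2)
```

For this small case the determinant is a constant multiple of $p_2^3 \prod_{i<j}(a_i - a_j)$, which suggests it vanishes only when the leading coefficient is zero or two of the points coincide.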
-
Oh yeah, I forgot I was looking at the opposite case, thanks. – 2010-11-30
Here is a fancy proof by induction that uses differentiation.
Case $n=0$.
A polynomial of degree $0$ has the form $P(x)=c_0$ with $c_0\neq 0$, and a single non-zero constant is clearly linearly independent.
Inductive step: degree $n \geq 1$.
Assume that there is a linear dependence
$\lambda_0 P(x+a_0) + \lambda_1 P(x+a_1) + \dots + \lambda_n P(x+a_n) = 0.$
Since this relation holds for all $x$, we can differentiate this relation. After differentiating $n$ times, we obtain a relation for the $n$-th derivative
$\lambda_0 P^{(n)}(x+a_0) + \lambda_1 P^{(n)}(x+a_1) + \dots + \lambda_n P^{(n)}(x+a_n) = 0.$
But since $P$ has degree $n$, each $P^{(n)}(x+a_i)$ is equal to the same constant $c\neq 0$, which implies
$ \lambda_0 + \lambda_1 + \dots + \lambda_n = 0 .$
By expressing $\lambda_0$ in terms of the other coefficients, we obtain a relation
$ \lambda_1 (P(x+a_1) - P(x+a_0)) + \dots + \lambda_n (P(x+a_n) - P(x+a_0)) = 0$
of polynomials of degree $n-1$ (the leading terms cancel). By induction, we conclude that all the $\lambda_i$ must vanish.
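A small sympy sketch of the two concrete steps used here, for a sample cubic and sample points (both chosen arbitrarily for illustration): differentiating $n$ times yields the same non-zero constant for every translate, and each difference $P(x+a_i) - P(x+a_0)$ has degree $n-1$:

```python
import sympy as sp

x = sp.symbols('x')
n = 3
P = x**3 + x + 1
a = [0, 1, 2, 5]  # distinct sample points a_0, ..., a_n

# The n-th derivative of every translate is the same non-zero constant (here 3! = 6).
print({sp.diff(P.subs(x, x + ai), x, n) for ai in a})  # -> {6}

# Each difference P(x + a_i) - P(x + a_0) drops to degree n - 1.
print([sp.Poly(sp.expand(P.subs(x, x + ai) - P.subs(x, x + a[0])), x).degree()
       for ai in a[1:]])  # -> [2, 2, 2]
```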
-
Oops, sorry for my mistakes, and sorry for not noticing your comment until now. (!) I don't think the argument can be salvaged, but I have a new one in mind which is just as fancy, but hopefully works. – 2011-02-16