10

The following is a proof from Apostol's linear algebra book that $\{1,t,t^{2},...\}$ is independent. To my eye, he's dividing by zero repeatedly. Is this really as huge an error as it seems, or are there missing details that would make this rigorous?

It suffices to show that for each $n$ the $n+1$ polynomials $1,t,...,t^{n}$ are independent. If these polynomials span the zero polynomial we have $\sum_{k=0}^{n}c_{k}t^{k}=0$ for all real $t$. Putting $t=0$ we see that $c_{0}=0$. Now divide by $t$ and put $t=0$ again to find $c_{1}=0$. Repeating the process we find every $c_{k}$ is $0$, so $1,t,...,t^{n}$ are independent.

Edit: I'm treating "polynomial" as a garden-variety function from $\mathbb{R}$ to $\mathbb{R}$, and it seems like that may be part of my problem.

  • 2
    @Patrick While one might *guess* a sensible meaning, that does not change the fact that the question was initially ill-specified. Further, such division does make sense in *every* polynomial ring, since monic polynomials (here $\,t\,$) are always cancellable. (2012-10-15)

6 Answers

12

As David mentioned in his comment, the issue here is that continuity is implicitly used. Once you deduce that $c_0 = 0$, you are left with $t(c_1 + c_2 t + c_3 t^2 + \dots + c_n t^{n-1}) = 0$, which means that for all $t \neq 0$, $c_1 + c_2 t + \dots + c_n t^{n-1} = 0$. But polynomials are continuous functions over the reals, so by continuity $c_1 + c_2 t + \dots + c_n t^{n-1} = 0$ also when $t = 0$, where we find that $c_1 = 0$ and the process can continue.
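
To spell the continuity step out as a one-line computation (my rendering of the step just described, using only facts already in the answer):

$$ c_1 \;=\; \lim_{t \to 0}\left(c_1 + c_2 t + \dots + c_n t^{n-1}\right) \;=\; \lim_{t \to 0} 0 \;=\; 0, $$

where the first equality uses continuity of the polynomial at $t = 0$, and the limit is taken through $t \neq 0$, which is exactly where the expression is known to vanish.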

This is where we use a very nice property of polynomial functions over the reals: the only polynomial with real coefficients that vanishes everywhere over the reals is the zero polynomial.

I'm saying this because I spent a summer studying polynomials with this "vanishing property", i.e. polynomials that evaluate to zero everywhere on their ground ring. Here the "ground ring" would be $\mathbb R$ (I wouldn't say "ground ring" is a standard term, but it works for me here). Replace $\mathbb R$ by, say, $\mathbb Z / p^n \mathbb Z$, and things get all messed up. Here's a nice result (just for show =D) that I managed to find:

Theorem. Let $f(x) \in (\mathbb Z / p^n \mathbb Z)[x]$. Then the associated polynomial function $f : \mathbb Z / p^n \mathbb Z \to \mathbb Z / p^n \mathbb Z$ is identically zero if and only if, when we write $ f(x) = \sum_{i=0}^n c_i (x)_i $ where $(x)_i = x(x-1)\dots(x-(i-1))$ with the convention $(x)_0 = 1$, we have $c_i \cdot i! \equiv 0 \mod p^n$ for every $i$.

Proof. ($\Longleftarrow$) Suppose $c_i \cdot i! \equiv 0 \mod p^n$ for $0 \le i \le n$. Then $ f(0) = c_0 \equiv c_0 \cdot 0! \equiv 0 \mod p^n. $ Assuming the claim holds up to $k-1$, we have $ f(k) = \sum_{i=0}^k c_i (k)_i $, and since $(k)_i = i! \binom{k}{i}$, we get $ \sum_{i=0}^k c_i (k)_i = \sum_{i=0}^k c_i \, i! \binom{k}{i} \equiv \sum_{i=0}^k 0 \equiv 0 \mod p^n, $ because binomial coefficients are integers.

($\Longrightarrow$) Suppose $f$ is identically zero $\mod p^n$. For $k \ge 0$ we have $ f(k) = \sum_{i=0}^n c_i \, i! \binom{k}{i}. $ Computing $f(0)$ gives $0 \equiv f(0) = c_0 \mod p^n$. By induction, $ 0 \equiv f(k) = \sum_{i=0}^k c_i \, i! \binom{k}{i} \equiv c_k \, k! \binom{k}{k} = c_k \, k! \mod p^n, $ so that $c_k \, k! \equiv 0 \mod p^n$.

This gives you a wide class of rings over which there is a bunch of polynomials that are non-zero yet vanish everywhere. Note that these rings are "discrete" in some sense, i.e. there is no notion of continuity that works in a similar fashion to the one on the reals, which is why we had hope that such polynomials might exist. One familiar example is of course $x^p - x$ in $\mathbb Z / p \mathbb Z$ (by Fermat's little theorem; note that $x^{p-1} - 1$ does not vanish at $x = 0$), but there are many other such polynomials, for instance $(x^p - x)^n q(x) \mod p^n$ where $q(x)$ is an arbitrary non-zero polynomial over $\mathbb Z / p^n \mathbb Z$, since $x^p - x \equiv 0 \mod p$ for every $x$. Note that those polynomials also form an ideal, which can be interesting. =)
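
If it helps to see this concretely, here is a small brute-force check of the $(x^p - x)^n q(x)$ example. This is my own sketch; the choices $p = 3$, $n = 2$, $q(x) = x + 1$ are arbitrary:

```python
# Brute-force check that (x^p - x)^n * q(x) is identically zero mod p^n,
# even though it is a nonzero formal polynomial.
p, n = 3, 2                 # work modulo p^n = 9
mod = p ** n

def poly_mul(a, b, m):
    """Multiply two coefficient lists modulo m (index = power of x)."""
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] = (out[i + j] + ai * bj) % m
    return out

def poly_eval(coeffs, x, m):
    """Evaluate the coefficient list at x modulo m (Horner's rule)."""
    acc = 0
    for c in reversed(coeffs):
        acc = (acc * x + c) % m
    return acc

base = [0, -1] + [0] * (p - 2) + [1]   # coefficients of x^p - x
f = [1]
for _ in range(n):
    f = poly_mul(f, base, mod)         # f = (x^p - x)^n
f = poly_mul(f, [1, 1], mod)           # times q(x) = 1 + x

# f has nonzero coefficients, yet the function vanishes everywhere:
assert any(f) and all(poly_eval(f, x, mod) == 0 for x in range(mod))
print("nonzero polynomial, identically zero mod", mod)
```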

Hope that was fun to read! And that it helps.

  • 0
    Yes, the continuity argument resolves things nicely. Thanks. (2012-02-12)
8

No, no division by zero is involved. There are two processes at work here: manipulation of polynomials, and evaluation of polynomials. When we manipulate polynomials, we do so as formal expressions, where "$t$" is a symbol that doesn't take on a particular value. We can then evaluate any polynomial we create this way at some given point, but the fact that we've divided out $t$ along the way doesn't mean that we've suddenly divided by zero. As an example, consider what happens in the following case: if I have a polynomial $a + bt$, and then I learn that $a = 0$, all I'm doing by dividing out by $t$ is considering the related polynomial $b$.
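
To illustrate this answer's point in a computer algebra system (my sketch, using SymPy; the symbol names are mine): the cancellation of $t$ is exact polynomial division in the ring, never an evaluation at $t = 0$.

```python
# Dividing out t from a polynomial whose constant term is known to be 0
# is exact division in the polynomial ring, not evaluation at t = 0.
import sympy as sp

t, b = sp.symbols('t b')
p = 0 + b*t              # the polynomial a + b*t after learning a = 0
q, r = sp.div(p, t, t)   # formal polynomial division of p by t
print(q, r)              # prints: b 0 -- quotient b, remainder 0
```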

  • 4
    I don't understand why you got 9 votes; your explanation is wrong. Considering the polynomials as formal expressions really does not help at all here; they're considered as functions all along. As formal expressions, the set $\{ 1, t, t^2, \dots \}$ is always independent over **any** ring, i.e. in the ring $R[x]$ of formal polynomials. This is really not the issue here. This is why I am downvoting. (2012-02-13)
4

Continuity isn't needed if we invoke the Factor Theorem (FT) for a general root $\rm\,r\,$ (vs. the root $\rm\,r = 0$).

Over a $\rm\,\color{#C00}{domain}\ D,$ assume a polynomial $\rm\, f\in D[x]\,$ has more roots than its degree. We prove by induction on the degree of $\rm\,f\,$ that all coefficients of $\rm\,f\:$ are $\,0.\,$ If $\rm\,f\,$ has degree $\,0\,$ then $\rm\,f\,$ is constant, say $\rm\:f = c\in D.\,$ Since $\rm\,f\,$ has a root, $\rm\,c = 0.\:$ So all coefficients of $\rm\,f\,$ are $\,0.\,$ Else $\rm\,f\,$ has degree $\ge 1,\:$ so $\rm\,f\,$ has a root $\rm\,r.\,$ By FT, $\rm\ f = (x\!-\!r)\,g,$ $\rm\: g\in D[x].\,$ Now $\rm\,g\,$ too has more roots than its degree, since every root $\rm\,s \ne r\,$ of $\rm\,f\,$ is a root of $\rm\,g$: from $\rm\,(s\!-\!r)\,g(s) = 0\,$ and $\rm\,s-r\ne 0\,$ we get $\rm\,g(s)=0,$ because $\rm\,D\,$ is a $\rm\color{#C00}{domain}.\:$ Therefore, by induction, all coefficients of $\rm\,g\,$ are $\,0,\,$ hence $\rm\,f = (x\!-\!r)\,g\:$ has all coefficients $0.\ \ $ QED

As a corollary, if $\rm\,\deg f < |D|\,$ and $\rm\,f(D) = 0,\,$ then all coefficients of $\rm\,f\,$ are $\rm\,0,\,$ i.e. if $\rm\,f\,$ is zero as a function, then it is zero as formal polynomial. In particular this is true for any infinite domain $\rm\,D,\,$ so the ring of polynomial functions over an infinite domain $\rm\,D\,$ is isomorphic to the ring $\rm\,D[x]\,$ of formal polynomials over $\rm\,D.$
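
As a concrete illustration of the corollary (my own sketch, in exact rational arithmetic): the values of a degree-$n$ polynomial at $n+1$ distinct points of an infinite domain determine its coefficients uniquely, so the zero function forces the zero formal polynomial.

```python
# Over the rationals, evaluating a degree-n polynomial at n+1 distinct
# points determines its coefficients uniquely (the Vandermonde matrix is
# invertible), so the zero *function* forces the zero *polynomial*.
from fractions import Fraction

def solve_vandermonde(points, values):
    """Solve sum_j c_j * x**j = v at the given points (Gauss-Jordan)."""
    n = len(points)
    A = [[Fraction(x) ** j for j in range(n)] + [Fraction(v)]
         for x, v in zip(points, values)]
    for col in range(n):
        piv = next(r for r in range(col, n) if A[r][col] != 0)
        A[col], A[piv] = A[piv], A[col]
        for r in range(n):
            if r != col and A[r][col] != 0:
                factor = A[r][col] / A[col][col]
                A[r] = [a - factor * b for a, b in zip(A[r], A[col])]
    return [A[i][n] / A[i][i] for i in range(n)]

# The values of 1 + x^3 at four distinct points recover its coefficients:
assert solve_vandermonde([0, 1, 2, 3], [1, 2, 9, 28]) == [1, 0, 0, 1]
# A cubic vanishing at four distinct points must be the zero polynomial:
assert solve_vandermonde([0, 1, 2, 3], [0, 0, 0, 0]) == [0, 0, 0, 0]
print("function values determine the formal polynomial")
```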

The proof fails over non-domains, e.g. $\rm\,x^2\!-\!1 = (x\!-\!1)(x\!+\!1)\,$ has $\,4\,$ roots $\,\pm1,\pm3\,$ over $\, \mathbb Z/8.\,$ Notice how the proof breaks down due to the existence of zero-divisors: $\,3\,$ is a root since $\,2\cdot4\equiv 0,\,$ but $\,3\,$ is not a root of either $\rm\,x\!-\!1\,$ or $\rm\,x\!+\!1;\,$ i.e. $\rm\,x\!-\!3\,$ divides $\rm\,(x\!-\!1)(x\!+\!1)\,$ but doesn't divide either factor, so it is a non-prime irreducible. This yields the nonunique factorization $\rm\,(x\!-\!3)(x\!+\!3)\equiv (x\!-\!1)(x\!+\!1).$
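
A quick brute-force check of the $\mathbb Z/8$ example (my sketch):

```python
# A degree-2 polynomial with four roots, possible only because Z/8
# has zero-divisors.
roots = [x for x in range(8) if (x * x - 1) % 8 == 0]
print(roots)   # [1, 3, 5, 7], i.e. ±1 and ±3 mod 8
```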

3

I think the proof is fine. Fix an $n$. Suppose the polynomials $1, t, \ldots, t^{n}$ were not linearly independent. Then there would exist $c_{i} \in \mathbb{R}$ such that $c_{0} + c_{1}t + c_{2}t^{2} + \cdots + c_{n}t^{n} = 0$ for all $t \in \mathbb{R}$. In particular, this must be true for $t = 0$. Setting $t = 0$ in the above degree-$n$ polynomial, we have $c_{0} = 0$. Our identity then becomes $t(c_{1} + c_{2}t + c_{3}t^{2} + \cdots + c_{n}t^{n - 1}) = 0$ for all $t \in \mathbb{R}$. This implies that $c_{1} + c_{2}t + c_{3}t^{2} + \cdots + c_{n}t^{n - 1} = 0$ for all $t \in \mathbb{R}$. In particular this is true for $t = 0$. By the same reasoning as before, $c_{1} = 0$. Repeating this process, we find that all the $c_{i}$'s are equal to $0$, and hence $\{1, t, \ldots, t^{n}\}$ is a linearly independent set.
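
For what it's worth, the repeated step of this answer can be carried out formally in a computer algebra system; here is my own SymPy sketch of the peeling loop (the names are mine):

```python
# The answer's repeated process, done formally: read off the constant
# term at t = 0, subtract it, and cancel one factor of t -- exact
# division in the polynomial ring, so no division by zero occurs.
import sympy as sp

t = sp.symbols('t')
n = 3
c = sp.symbols(f'c0:{n + 1}')                # symbols c0, c1, c2, c3
p = sum(ck * t**k for k, ck in enumerate(c))

peeled = []
for _ in range(n + 1):
    const = p.subs(t, 0)                       # current constant term
    peeled.append(const)
    p = sp.expand(sp.cancel((p - const) / t))  # divide the tail by t
print(peeled)                                  # [c0, c1, c2, c3]
```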

  • 3
    I haven't read Apostol, but I assume he is talking about the $t^k$ as functions on $\mathbb{R}$. If he were talking about linear dependence in the ring $\mathbb{R}[t]$, the result would be tautological. (2012-02-12)
3

I'm posting this because my browser didn't find the word "infinite" in the other answers.

The key point, it seems to me, is that, over an infinite domain, the algebra morphism from polynomials to polynomial functions is injective.

More carefully stated: Let $A$ be an infinite domain, let $X$ be an indeterminate, let $B$ be the $A$-algebra of polynomial functions from $A$ to $A$. Then the natural $A$-algebra morphism from $A[X]$ to $B$ is injective.

(I'm sure everybody knows the proof.)

As pointed out by David Speyer, the claim is tautological for $A[X]$.

(We can replace "injective" by "bijective" above if we want, but the crucial property is injectivity.)
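
For contrast, over a finite ring the morphism is not injective; here is a one-line check (my sketch, taking $p = 5$):

```python
# Over Z/5, the nonzero formal polynomial x^5 - x maps to the
# identically-zero function, so evaluation is not injective --
# this is why "infinite" is essential above.
p = 5
print([(x**p - x) % p for x in range(p)])   # [0, 0, 0, 0, 0]
```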

  • 0
    Yes, that's an immediate corollary of the proof I gave. I agree it is worth explicit emphasis. It is also discussed in some prior answers here. (2012-02-20)
-1

You might use induction. For $n = 1$: $p(t) = c_0 + c_1 t$ is zero on $\mathbb{R}\setminus\{0\}$ iff $c_0, c_1$ are both zero. Suppose the claim is true for $n$, and consider $c_0 + \cdots + c_{n+1} t^{n+1} = 0$. Substituting $t = 0$ we get $c_0 = 0$; then $c_1 + \cdots + c_{n+1} t^{n} = 0$ for $t \neq 0$ after dividing by $t$, and by the induction hypothesis we get the result.