I am curious why the following is true. The text I am reading is "An Introduction to Numerical Analysis" by Atkinson, 2nd edition, page 133, line 4.
$p(x)$ is a polynomial of the form:
$ p(x) = b_0 + b_1 x + \cdots + b_n x^n$
If $p(x) = 0$ for all $x$, then $b_i = 0$ for $i=0,1,\ldots,n$.
Why is this true? For example, for $n=2$, I can first prove $b_0=0$ by setting $x=0$; then substituting two nonzero values such as $x=1$ and $x=2$ gives a linear system of two equations, from which I can prove $b_1=b_2=0$. Similarly, for $n=3$, I first prove $b_0=0$, then I compute the rank of the resulting linear system, which shows that $b_1=b_2=b_3=0$. But if $n$ is very large, I cannot keep solving systems of equations by hand. Is there some other argument showing that all the coefficients must be zero when the polynomial is zero for all $x$? (For concreteness, I write out the $n=2$ system below.)
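For the $n=2$ case, the computation I have in mind is the following (taking $x=1$ and $x=2$ as the two sample points): once $b_0=0$ is known, the conditions $p(1)=0$ and $p(2)=0$ give
$$
\begin{pmatrix} 1 & 1 \\ 2 & 4 \end{pmatrix}
\begin{pmatrix} b_1 \\ b_2 \end{pmatrix}
=
\begin{pmatrix} 0 \\ 0 \end{pmatrix},
$$
and since the coefficient matrix is invertible (its determinant is $2$), the only solution is $b_1=b_2=0$.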
Thanks.