
I'd like some help making this argument complete and rigorous (assuming it's correct; if not, help with that would be welcome too).

Here $k$ is a field.

Let $A_1,\ldots,A_n \subseteq k$ be infinite subsets. Then any polynomial in $k[x_1,\ldots,x_n]$ that vanishes on $A_1\times\cdots\times A_n\subseteq k^n$ must be $0$ (as a polynomial).

This is what I have ...

For the case $n=1$: a nonzero polynomial in one variable over a field has at most as many roots as its degree, and in particular only finitely many. The only polynomial in $k[x_1]$ with infinitely many roots is $0$, so a polynomial in $k[x_1]$ that vanishes on an infinite subset of $k$ must be $0$.
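
To spell out the root count this relies on: if $p\in k[x]$ has degree $d\ge 1$ and $p(a)=0$, then dividing by $x-a$ gives

$$p(x)=(x-a)\,q(x),\qquad \deg q = d-1,$$

and every root of $p$ is either $a$ or a root of $q$, so by induction on the degree $p$ has at most $d$ roots in $k$. An infinite zero set therefore forces $p=0$.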

For the inductive step, suppose the proposition is true for fewer than $n$ subsets and variables. Let $p\in k[x_1,\ldots,x_n]$ vanish on $A_1\times\cdots\times A_n$. Setting $x_n$ equal to some $a\in A_n$, we get a polynomial in $n-1$ variables that vanishes on the set $A_1\times\cdots\times A_{n-1}$, so by the inductive hypothesis it must be identically $0$. (Now it gets sketchy.) Since this is true for each of the infinitely many values in $A_n$, $p$ must be $0$.
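
My best guess at how to justify that last step (I'm not sure it's valid, which is part of what I'm asking) is to pass to the fraction field $K=k(x_1,\ldots,x_{n-1})$ and regard $p$ as a one-variable polynomial

$$p=\sum_{i=0}^{d}c_i(x_1,\ldots,x_{n-1})\,x_n^{\,i}\ \in\ K[x_n].$$

For each $a\in A_n$, the previous step says that $p(x_1,\ldots,x_{n-1},a)=\sum_i c_i\,a^i$ is identically $0$, i.e. it is the zero element of $K$, so $a$ is a root of $p$ in $K$. Since $A_n$ is infinite, $p$ has infinitely many roots in the field $K$, and the $n=1$ argument (applied over $K$) forces $p=0$ in $K[x_n]$; that is, every $c_i=0$ and hence $p=0$ in $k[x_1,\ldots,x_n]$.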

  • @QiaochuYuan, I disagree. IMHO, the argument in the OP does not quite constitute a proof yet. The comment following Pete Clark's doesn't quite seem to me to get it done, either, because the hypothesis doesn't prima facie imply that if you sub in a value $a_n\in A_n$ for $x_n$ you will get zero as an element of the field $k(x_1,\dots,x_{n-1})$. I've tried to supply what I think is missing in the above comments. (2017-06-16)
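    To be explicit about the gap I mean: the vanishing hypothesis only gives the pointwise statement
    $$p(a_1,\ldots,a_{n-1},a_n)=0\quad\text{for all }(a_1,\ldots,a_{n-1})\in A_1\times\cdots\times A_{n-1},$$
    i.e. $p(x_1,\ldots,x_{n-1},a_n)$ vanishes as a function on that set; one needs the $(n-1)$-variable case of the proposition to conclude that it is the zero element of $k(x_1,\dots,x_{n-1})$.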

1 Answer


Answered satisfactorily in the comments.