Polynomials vanishing on an infinite set

I'd like some help making this argument complete and rigorous (if it's correct - if not, help with that would be nice).

Here $k$ is a field.

Let $A_1,\ldots,A_n \subseteq k$ be infinite subsets. Then any polynomial in $k[x_1,\ldots,x_n]$ that vanishes on $A_1\times\cdots\times A_n\subseteq k^n$ must be $0$ (as a polynomial).
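
(As a sanity check on the hypotheses, an aside of my own rather than part of the argument: the infiniteness really is needed. Over a finite field $k=\mathbb{F}_q$ the nonzero polynomial
$$\prod_{a\in\mathbb{F}_q}(x_1-a)=x_1^q-x_1$$
vanishes at every point of $k^n$.)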

This is what I have ...

For the case $n=1$: a nonzero polynomial has at most as many roots as its degree, so in particular it has only finitely many roots. The only polynomial in one variable with infinitely many roots is $0$, so a polynomial in $k[x_1]$ that vanishes on an infinite subset of $k$ must be $0$.
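
(To spell out the standard fact I'm relying on, in case that's where rigor is missing: if $a_1,\ldots,a_m$ are distinct roots of a nonzero $p\in k[x_1]$, then factoring out roots one at a time gives
$$(x_1-a_1)\cdots(x_1-a_m)\mid p(x_1),$$
so $m\le\deg p$; hence a nonzero one-variable polynomial over a field has only finitely many roots.)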

For the inductive step, suppose the proposition is true for fewer than $n$ variables (and subsets). Let $p\in k[x_1,\ldots,x_n]$ vanish on $A_1\times\cdots\times A_n$. Fixing $x_n = a$ for some $a\in A_n$ gives a polynomial in the $n-1$ variables $x_1,\ldots,x_{n-1}$ that vanishes on $A_1\times\cdots\times A_{n-1}$, so by the inductive hypothesis it is identically $0$. (Now it gets sketchy.) Since this holds for every one of the infinitely many values $a\in A_n$, $p$ must be $0$.
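
Here is how I think the sketchy step might be finished; this is my own attempt, so it may not be the intended route. Expand $p$ in powers of $x_n$ with coefficients in $k[x_1,\ldots,x_{n-1}]$,
$$p=\sum_{i=0}^{d}c_i(x_1,\ldots,x_{n-1})\,x_n^{i},\qquad c_i\in k[x_1,\ldots,x_{n-1}].$$
For each $a\in A_n$, the previous step says that $\sum_{i=0}^{d}a^i c_i=0$ in $k[x_1,\ldots,x_{n-1}]$. Viewing $q(t)=\sum_{i=0}^{d}c_i\,t^i$ as a one-variable polynomial over the fraction field $k(x_1,\ldots,x_{n-1})$ (the $n=1$ argument works over any field), every element of the infinite set $A_n$ is a root of $q$, so $q=0$, i.e. every $c_i=0$, and therefore $p=0$.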


Answered satisfactorily in the comments.