Is it true that for algebraic sets $V,W$ we have $I(V \times W ) =I(V) + I(W)$?

This is a follow-up question to my previous question here. Let $k$ be a field and let $V \subseteq \Bbb{A}^n$ and $W \subseteq \Bbb{A}^m$ be algebraic sets. Then it should be true that $I(V \times W ) = I(V) + I(W)$, where by $I(V)$ we mean the extension $I(V)k[x_1,\ldots,x_{m+n}]$, and similarly for $I(W)$. Now I believe I have proven this (see the proof at the bottom of my question), but when I look at Martin's answer here, it is instead claimed that we actually have

$$I(V \times W )= \sqrt{I(V) + I(W)},$$

and that for $I(V) + I(W)$ to be a radical ideal we need $k$ to be algebraically closed.

My question is: What's going on here? I believe my claim is true even without the assumption that $k$ is algebraically closed.

Here's a proof that user Sanchez showed me, which I have simplified:

First, it is clear that we always have $I(V) + I(W) \subseteq I(V \times W)$. For the reverse inclusion, consider a polynomial $f \in I(V \times W)$. Then we can always write $$f = \sum_{i=1}^r f_ig_i$$ where $f_i \in k[x_1,\ldots,x_n]$ and $g_i \in k[x_{n+1},\ldots,x_{m+n}]$: for instance, group the monomials of $f$ by their part in the variables $x_{n+1},\ldots,x_{m+n}$, so that the $g_i$ are distinct monomials and the $f_i$ are their coefficients in $k[x_1,\ldots,x_n]$. Now suppose that $g_i(b') = 0$ for all $i$ and all $b' \in W$. Then $g_i \in I(W)$ for all $i$, hence $f \in I(W) \subseteq I(V) + I(W)$ and we are done. Otherwise there exist $b \in W$ and $i$ such that $g_i(b) \neq 0$, and wlog we may suppose that $g_1(b) \neq 0$.

Next, the polynomial $\sum_i g_i(b) f_i$ vanishes on all of $V$, by the assumption that $f \in I(V \times W)$. So $\sum_i g_i(b) f_i = p$ for some $p \in I(V)$. Solving for $f_1$, we can write $$f_1 = \frac{ p - g_2(b) f_2 - \cdots - g_r(b)f_r}{g_1(b)}.$$

Substituting this expression for $f_1$ into $f = \sum_i f_ig_i$ and then reducing modulo $I(V)$, we get an expression with only $r-1$ terms $\bmod{I(V)}$. Continuing this process, at each step we either land in the first case (contributing an element of $I(W)$) or drop the number of terms by one, so we finally reach an expression with $0$ terms $\bmod{I(V)}$ and conclude $f \in I(V) + I(W)$. This shows $I(V \times W) \subseteq I(V) + I(W)$, which completes the proof.
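To spell out the reduction step (this is just the computation above written out): substituting the displayed expression for $f_1$ gives
$$f = \frac{g_1}{g_1(b)}\,p + \sum_{i=2}^{r} f_i\left(g_i - \frac{g_i(b)}{g_1(b)}\,g_1\right),$$
where the first summand lies in the extension of $I(V)$ and each $\tilde{g}_i = g_i - \frac{g_i(b)}{g_1(b)}\,g_1$ again lies in $k[x_{n+1},\ldots,x_{m+n}]$. Hence $f \equiv \sum_{i=2}^{r} f_i \tilde{g}_i \pmod{I(V)}$, and since $f$ and the extension of $I(V)$ both vanish on $V \times W$, the shorter sum again lies in $I(V \times W)$, so the argument can be repeated.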


Your proof is correct. At first sight this contradicts the statements in the other thread, so let me clear up the confusion. Let $k$ be a field. By $x$ I mean a system of variables $x = x_1,\dotsc,x_n$, and similarly $y=y_1,\dotsc,y_m$.

(A) For radical ideals $I \subseteq k[x], J \subseteq k[y]$ the generated ideal $\langle I,J \rangle \subseteq k[x,y]$ is radical.

(A') The tensor product of reduced $k$-algebras is reduced.

(B) For algebraic subsets $V \subseteq k^n, W \subseteq k^m$ we have $I(V \times W) = \langle I(V) ,I(W) \rangle$ in $k[x,y]$.

Then we have $(A) \Leftrightarrow (A') \Rightarrow (B)$ and $(A')$ holds if $k$ is algebraically closed. This is what I have used to prove that $(B)$ holds in the other thread. But your proof shows that $(B)$ holds in general. Good to know that!
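The equivalence $(A) \Leftrightarrow (A')$ rests on the standard isomorphism
$$k[x]/I \otimes_k k[y]/J \;\cong\; k[x,y]/\langle I,J \rangle,$$
together with the fact that a quotient ring is reduced if and only if the corresponding ideal is radical. (To pass from finitely generated algebras to arbitrary reduced $k$-algebras, note that a nilpotent element of a tensor product already lies in the tensor product of two finitely generated subalgebras.)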

But $(A')$ does not hold in general, as I have explained in the other thread. Namely, let $n=m=1$, $k = \mathbb{F}_p(t^p)$, $I=(x^p-t^p)$ and $J=(y^p-t^p)$. These are even maximal ideals with $k[x]/I \cong \mathbb{F}_p(t) \cong k[y]/J$. But their sum $\langle I,J \rangle = ((x-y)^p,\,x^p-t^p)$ is not a radical ideal. Note that although $I,J$ are radical ideals, they cannot be written as vanishing ideals of algebraic subsets of $k$, since we have $V(I)=\emptyset$ and $I \neq (1)$ (similarly for $J$).
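To spell out why $\langle I,J \rangle$ is not radical: in characteristic $p$ we have
$$(x-y)^p = x^p - y^p = (x^p - t^p) - (y^p - t^p) \in \langle I,J \rangle,$$
but $x - y \notin \langle I,J \rangle$. Indeed, under the isomorphism $k[x,y]/\langle I,J \rangle \cong \mathbb{F}_p(t) \otimes_k \mathbb{F}_p(t)$ the class of $x - y$ maps to $t \otimes 1 - 1 \otimes t$, which is nonzero (the elements $t^i \otimes t^j$ with $0 \leq i,j < p$ form a $k$-basis of the tensor product) but nilpotent.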

Finally, let me mention the "correct" framework for classical algebraic geometry over a field $k$ which is not algebraically closed. One chooses an algebraic closure $\overline{k}$ and defines $V(I) = \{\alpha \in \overline{k}^n : p(\alpha)=0 \text{ for all } p \in I\}$. The vanishing ideal is defined as usual. One gets a Galois connection between ideals of $k[x_1,\dotsc,x_n]$ and subsets of $\overline{k}^n$, which restricts to a duality between radical ideals of $k[x_1,\dotsc,x_n]$ and algebraic subsets of $\overline{k}^n$. This is Hilbert's Nullstellensatz for an arbitrary field $k$. The corresponding statement $(B)$ for algebraic subsets of $\overline{k}^n$ and $\overline{k}^m$ therefore turns out to be equivalent to $(A')$, and thus does not hold in general. This is also why I claimed in the other thread that $(B)$ fails when $k$ is not assumed to be algebraically closed.
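In formulas, this duality says that for every ideal $\mathfrak{a} \subseteq k[x_1,\dotsc,x_n]$ we have
$$I(V(\mathfrak{a})) = \sqrt{\mathfrak{a}},$$
with $V(\mathfrak{a}) \subseteq \overline{k}^n$ defined as above.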