How does one prove that a multivariate function is univariate?

The question resembles "How does one prove that a multivariate function is constant?" but appears to be more difficult.

Suppose that $u\colon\mathbb R^2\to\mathbb R$ is a continuous function such that at every point $(x,y)\in\mathbb R^2$ at least one of the following is true:

  1. the partial derivative $\displaystyle \frac{\partial u}{\partial x}$ exists and is equal to $0$.
  2. the partial derivative $\displaystyle \frac{\partial u}{\partial y}$ exists and is equal to $0$.

Does it follow that $u$ depends only on one of the two variables $x,y$? In other words, can we prove that either 1 holds for all points or 2 holds for all points?
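To illustrate the hypothesis with simple examples: any function of the form $u(x,y)=f(x)$ with $f$ continuous satisfies condition 2 at every point, while a function that genuinely mixes the variables tends to violate the hypothesis somewhere. For instance, $u(x,y)=\max(x,y)$ satisfies condition 2 where $x>y$ and condition 1 where $x<y$, but at a diagonal point $(a,a)$ neither partial derivative exists:
$$\lim_{h\to 0^-}\frac{\max(a+h,a)-a}{h}=0\ \ne\ 1=\lim_{h\to 0^+}\frac{\max(a+h,a)-a}{h},$$
so $\max(x,y)$ is not a counterexample.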

So far I can prove this only under the additional assumption $u\in C^1$.

Edit: Here is a proof of the $C^1$ case.

Claim 1: If $u_x(a,b)\ne 0$ for some $(a,b)$, then $u_x(a,y)=u_x(a,b)$ for all $y\in\mathbb R$.

Claim 2: If $u_y(c,d)\ne 0$ for some $(c,d)$, then $u_y(x,d)=u_y(c,d)$ for all $x\in\mathbb R$.

Once Claims 1 and 2 are proved, it follows that the premise of at least one of them never holds: otherwise the claims would make both partial derivatives nonzero at the point $(a,d)$, contradicting the assumption. Since $u\in C^1$, both partials exist everywhere, so one of them vanishes identically and $u$ depends on only one variable.

By symmetry it suffices to prove Claim 1. The set $E=\{y\in\mathbb R\colon u_x(a,y)=u_x(a,b)\}$ is closed by the continuity of $u_x$. Also, if $t\in E$ then $u_x\ne 0$ in some neighborhood of $(a,t)$ (again by continuity), which by assumption yields $u_y=0$ in this neighborhood, meaning $u$ does not depend on $y$ there. Thus, $E$ is also open. Since $E$ is nonempty ($b\in E$), closed, and open, the connectedness of $\mathbb R$ gives $E=\mathbb R$, proving the claim.
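In more detail, the openness step goes as follows: if $u_y\equiv 0$ on a square $Q=(a-\delta,a+\delta)\times(t-\delta,t+\delta)$, then for each fixed first coordinate the function $y\mapsto u(x,y)$ is constant on $(t-\delta,t+\delta)$, so for every $y$ with $|y-t|<\delta$
$$u_x(a,y)=\lim_{h\to 0}\frac{u(a+h,y)-u(a,y)}{h}=\lim_{h\to 0}\frac{u(a+h,t)-u(a,t)}{h}=u_x(a,t)=u_x(a,b),$$
that is, $(t-\delta,t+\delta)\subset E$.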


Solution 1:

This question was answered in the affirmative by A.M. Bruckner, G. Petruska, D. Preiss, and B.S. Thomson in "The equation $u_xu_y=0$ factors", Acta Math. Hungar. 57 (1991), no. 3-4, 275-278. Actually, they prove a more general result for separately continuous functions. The proof for continuous functions is quite simple (once you see it, of course!). The main idea is in the following lemma.

Lemma 1. Let $R$ be a closed axes-aligned rectangle, and let $p$ be a vertex of $R$. Then the connected component of $p$ in the set $E=\{q\in R\colon u(q)=u(p)\}$ meets a side of $R$ not containing $p$.

Assume Lemma 1 for now.

Lemma 2. In the notation of Lemma 1, $u(r)=u(p)$ for some vertex $r$ adjacent to $p$.

Proof: We may assume $p$ is the bottom-left vertex and that the component of $p$ in $E$ meets the right side of $R$. Then $E$ separates the bottom-right vertex $r$ from the sides not containing it (the top and the left side). Now apply Lemma 1 at the vertex $r$: the component of $r$ in $\{q\in R\colon u(q)=u(r)\}$ meets the top or the left side, so by the separation it must meet $E$; at a common point $q$ we get $u(r)=u(q)=u(p)$. Lemma 2 is proved.

Now the proof is finished as follows. Suppose $u$ attains distinct values at two points $p_1$, $p_2$ lying on the same vertical line. For each point $p$ on this line pick $i\in\{1,2\}$ such that $u(p_i)\ne u(p)$ (possible since $u(p_1)\ne u(p_2)$). In any rectangle having $p$ and $p_i$ as two of its vertices, these two points lie on a common vertical side, so the vertices adjacent to $p$ are $p_i$ and a point $q$ on the horizontal line through $p$; by Lemma 2, $u(q)=u(p)$. Since such rectangles exist with arbitrary width on either side of $p$, $u$ is constant on the horizontal line through $p$. As $p$ ranges over the vertical line, every horizontal line arises this way, so $u$ is constant on all horizontal lines; that is, $u$ does not depend on $x$. QED

It remains to prove Lemma 1. The proof involves comparison with square cones, which are functions of the form $f(q)=c\|q-a\|_1=c(|q_1-a_1|+|q_2-a_2|)$ with $c>0$. It also uses, perhaps surprisingly, the equality of components and quasicomponents in compact Hausdorff spaces (such as $E$).

We may assume $p$ is the bottom-left vertex of $R$. Suppose, for contradiction, that the component of $p$ in $E$ meets neither the top nor the right side of $R$. Then there exist disjoint sets $U,V$, open in $R$, which cover $E$ in such a way that $U$ contains $p$ and $V$ contains both the top and the right side of $R$.

Since $u(q)\ne u(p)$ and $q\ne p$ on the compact set $R\setminus (U\cup V)$, there exists $\epsilon>0$ such that $|u(q)-u(p)|\ge 2\epsilon \|q-p\|_1$ for all $q\in R\setminus (U\cup V)$. The set $K=\{q\in U\colon |u(q)-u(p)|\le \epsilon \|q-p\|_1\}$ is nonempty (it contains $p$) and compact: a limit point of $K$ outside $U$ would lie in $R\setminus(U\cup V)$, where the two inequalities force $q=p\in U$. Let $r$ be the largest point of $K$ in the lexicographic order. The points $q\in U$ such that $q_1\ge r_1$ and $q_2\ge r_2$ satisfy $|u(q)-u(r)|\ge \epsilon \|q-r\|_1$. In particular, this holds when $q$ approaches $r$ from the right or from above; such approaches stay in $U$ because $r$ is not on the top or right side of $R$ (those lie in $V$) and $U$ is open in $R$. Hence $|u_x(r)|\ge\epsilon$ if $u_x(r)$ exists and $|u_y(r)|\ge\epsilon$ if $u_y(r)$ exists. But by assumption one of these partial derivatives exists and vanishes at $r$, a contradiction.
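The key inequality $|u(q)-u(r)|\ge \epsilon \|q-r\|_1$ above can be checked as follows: if $q\ne r$, then $q$ is lexicographically larger than $r$, so $q\notin K$ and hence $|u(q)-u(p)|>\epsilon\|q-p\|_1$, while $|u(r)-u(p)|\le\epsilon\|r-p\|_1$ because $r\in K$. Since $p$ is the bottom-left vertex, $p_1\le r_1\le q_1$ and $p_2\le r_2\le q_2$, so $\|q-p\|_1-\|r-p\|_1=\|q-r\|_1$, and therefore
$$|u(q)-u(r)|\ \ge\ |u(q)-u(p)|-|u(r)-u(p)|\ \ge\ \epsilon\bigl(\|q-p\|_1-\|r-p\|_1\bigr)\ =\ \epsilon\|q-r\|_1,$$
which is the claimed estimate.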