Let $p_1, p_2,\dots,p_n$ be polynomials in the $k$ variables $x_1,\dots,x_k$ such that $p_1^2 + \dots + p_n^2 = x_1^2 + \dots + x_k^2$. Prove that $n \geq k$.

Let's see. To begin with, all the polynomials must be linear: otherwise, taking $d \geq 2$ to be the highest degree among the $p_i$, the degree-$2d$ part of the sum of squares would be the sum of squares of the degree-$d$ homogeneous parts, a nonzero real polynomial with no chance to cancel against anything of lower degree. (Setting $\vec x = \vec 0$ also shows the constant terms must vanish.)

The rest is simple: for a linear polynomial, $p(\vec x)=0$ is a hyperplane through the origin, which has a normal vector. With $n<k$, the $n$ normal vectors cannot span a $k$-dimensional space, so there is a nonzero vector perpendicular to all of them (that is, lying in every hyperplane $p_i(\vec x)=0$). At this vector $p_1(\vec x)=\dots=p_n(\vec x)=0$, hence $p_1^2 + \dots +p_n^2=0$, while $x_1^2 + \dots + x_k^2>0$: a contradiction.
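If it helps to see this concretely, here is a quick numerical sketch of the kernel argument (the two linear forms are made up for the demo):

```python
import numpy as np

# Illustration with n = 2 linear forms in k = 3 variables; the rows of C
# are the normal vectors a_i (coefficients chosen arbitrarily).
C = np.array([[1.0, 2.0, -1.0],
              [0.0, 3.0,  4.0]])

# With n < k the common kernel is nontrivial: the last right-singular
# vector is a nonzero vector orthogonal to every row of C.
_, _, Vt = np.linalg.svd(C)
v = Vt[-1]

print(C @ v)                    # ~ [0, 0]: p_1(v) = p_2(v) = 0
print(np.sum((C @ v) ** 2))     # sum of squares of the p_i: ~ 0
print(v @ v)                    # but x_1^2 + ... + x_k^2 = 1 > 0
```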

With $n=k$ the solutions (besides the obvious $p_i=x_i$) are plentiful; just take any orthonormal basis. For example, $({2\over3}x-{1\over3}y+{2\over3}z,\; {2\over3}x+{2\over3}y-{1\over3}z,\; -{1\over3}x+{2\over3}y+{2\over3}z)$ would do in 3D.
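A quick sanity check of that 3D example (NumPy, coefficients as above):

```python
import numpy as np

# Coefficient matrix of the 3-D example; each row holds the
# coefficients of one p_i in (x, y, z).
Q = np.array([[ 2, -1,  2],
              [ 2,  2, -1],
              [-1,  2,  2]]) / 3.0

# Rows are orthonormal, so Q'Q = I and hence sum p_i^2 = x'Q'Qx = x'x.
print(np.allclose(Q.T @ Q, np.eye(3)))   # True

x = np.random.randn(3)                   # spot-check at a random point
print(np.isclose(np.sum((Q @ x) ** 2), np.sum(x ** 2)))  # True
```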


As Ivan Neretin's answer points out, all polynomials must be strictly linear. That is, each polynomial can be written in matrix form as $p_j = {\bf a}_j' {\bf x}$, where ${\bf a}_j$ and ${\bf x}$ are $k \times 1$ column vectors (the polynomial's coefficients and the variables, respectively).

Then $$p_j^2 = ({\bf a}_j' {\bf x})^2 = {\bf x}' A_j {\bf x}$$ where $A_j = {\bf a}_j{\bf a}_j'$ is a $k\times k$ rank-one, positive semidefinite matrix.

Then [*], because $\sum_{i=1}^k x_i^2= {\bf x}' {\bf x}$, the hypothesis requires

$$ \sum_{j=1}^n {\bf x}' A_j {\bf x}= {\bf x}' A {\bf x} = {\bf x}' {\bf x} $$ where $A = \sum_{j=1}^n A_j $ (symmetric, positive definite). Since a symmetric matrix is determined by its quadratic form, this implies $A=I_k$.

But rank is subadditive ($\operatorname{rank}(A) \le \sum_j \operatorname{rank}(A_j) = n$), so we need to sum at least $k$ rank-one matrices to obtain a matrix of rank $k$. Hence $n\ge k$.
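For the skeptical reader, a small numerical sketch of this rank bookkeeping (the vectors ${\bf a}_j$ are drawn at random here; `matrix_rank` is just NumPy's numerical rank):

```python
import numpy as np

k = 4
rng = np.random.default_rng(0)

# A sum of n rank-one matrices a_j a_j' has rank at most n (subadditivity),
# so n < k vectors can never produce the rank-k matrix I_k.
for n in (2, 3, 4):
    A = sum(np.outer(a, a) for a in rng.standard_normal((n, k)))
    print(n, np.linalg.matrix_rank(A))   # rank min(n, k) for generic vectors

# Conversely, n = k works: outer products of an orthonormal basis sum to I_k.
E = np.eye(k)
A = sum(np.outer(E[j], E[j]) for j in range(k))
print(np.allclose(A, np.eye(k)))         # True
```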


[*] As noted in the comments, an equivalent alternative argument here is:

We must have ${\bf x}' A {\bf x}=\sum_{i=1}^k x_i^2>0$ for every ${\bf x}\ne {\bf 0}$. This implies $A {\bf x} \ne {\bf 0}$ for ${\bf x}\ne {\bf 0}$, hence $A$ must have full rank $k$. Then the final paragraph above applies.


Having written everything out, I now realize that this is merely a paraphrase of leonbloy's answer, using explicit indexed sums and filling in the details, but I figure I might as well post it at this point.

As has already been pointed out, each polynomial must be strictly linear. That is, for each $q$, we may write $$p_q(\mathbf{x})=\sum_{i=1}^k \alpha_q^i x_i,$$ so that $\alpha_q^i$ is the coefficient of $x_i$ in the $q^{\text{th}}$ polynomial. But then $$\sum_{q=1}^n(p_q(\textbf{x}))^2=\left(\sum_{i=1}^k \left[\sum_{q=1}^n (\alpha_q^i)^2\right](x_i)^2\right)+\left(\sum_{1\le i<j\le k} \left[\sum_{q=1}^n 2\alpha_q^i\alpha_q^j\right]x_ix_j\right).$$

Thus, if these polynomials $p_1,p_2,\ldots,p_n$ satisfy the property that $\sum_{q=1}^n(p_q(\textbf{x}))^2=\sum_{i=1}^k (x_i)^2$, then necessarily $$\sum_{q=1}^n (\alpha_q^i)^2=1$$ for all $i$, while $$\sum_{q=1}^n 2\alpha_q^i\alpha_q^j=0$$ for all $i\neq j$. Thinking of the columns $\alpha^i=(\alpha_1^i,\ldots,\alpha_n^i)$ as vectors in $n$-dimensional Euclidean space, we see that this boils down to the statement: "you can't have $k$ pairwise orthogonal unit vectors in $\mathbb{R}^n$ unless $n\geq k$", which follows from the basis theorem of linear algebra. I hope that this makes sense. Please let me know if anything here is unclear at all. Great question!
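If anyone wants to check this coefficient bookkeeping symbolically, here is a short SymPy sketch, reusing the 3-D example from the first answer:

```python
import sympy as sp

x, y, z = sp.symbols('x y z')

# The 3-D example from the first answer: three linear forms whose
# squares sum to x^2 + y^2 + z^2.
p = [sp.Rational(2, 3)*x - sp.Rational(1, 3)*y + sp.Rational(2, 3)*z,
     sp.Rational(2, 3)*x + sp.Rational(2, 3)*y - sp.Rational(1, 3)*z,
     sp.Rational(-1, 3)*x + sp.Rational(2, 3)*y + sp.Rational(2, 3)*z]

total = sp.expand(sum(q**2 for q in p))
print(total)                              # x**2 + y**2 + z**2

# Rows of M are the coefficient lists; its columns are the vectors
# alpha^i = (alpha_1^i, ..., alpha_n^i).  M'M = I verifies exactly the
# stated conditions: unit norms on the diagonal, zero cross terms off it.
M = sp.Matrix([[q.coeff(v) for v in (x, y, z)] for q in p])
print(M.T * M)                            # identity matrix
```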