very elementary proof of Maxwell's theorem

Maxwell's theorem (after James Clerk Maxwell) says that if a function $f(x_1,\ldots,x_n)$ of $n$ real variables is a product $f_1(x_1)\cdots f_n(x_n)$ and is rotation-invariant in the sense that the value of the function depends on $x_1,\ldots,x_n$ only through $x_1^2+\cdots+x_n^2$, then it's an exponential function of that sum of squares (so $f$ is a "Gaussian function").
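In symbols: the conclusion is that there are constants $C$ and $a$ such that

$$f(x_1,\ldots,x_n)=C\,e^{a(x_1^2+\cdots+x_n^2)};$$

if $f$ is moreover a probability density, then necessarily $C>0$ and $a<0$.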

Say a class of undergraduates knows only enough mathematics to understand what the theorem says (so maybe they've never heard of inner products or orthogonal matrices), but they're bright and can understand mathematical arguments. What proof of Maxwell's theorem do you show them? Can you keep it short and simple?


Solution 1:

I assume the $C^1$ regularity of all functions and the fact that $f$ is never zero. I think that it can be proved that if $f$ is somewhere zero, then it is everywhere zero. The idea I came up with is the following:

$$f(x_1,\ldots,x_n)=f_1(x_1)\cdots f_n(x_n)=\phi(x_1^2+\cdots+x_n^2)$$

Denote $r^2=x_1^2+\cdots+x_n^2$. Differentiate with respect to $x_i$ and get

$$f_1(x_1)\cdots f_i'(x_i)\cdots f_n(x_n)=\phi'(r^2)2x_i$$

Divide by $f$ (and by $2x_i$, for $x_i\neq 0$) and obtain

$$\frac{f_i'(x_i)}{f_i(x_i)}\frac{1}{2x_i}=\frac{\phi'(r^2)}{f(x_1,\ldots,x_n)}$$

Therefore the LHS is independent of $i$. But the LHS for index $i$ depends only on $x_i$, while for another index $j$ it depends only on $x_j$; a quantity that is simultaneously a function of $x_i$ alone and of $x_j$ alone must be constant. Hence there is a single constant $a$, the same for every $i$, such that $$\frac{f_i'(x_i)}{f_i(x_i)}\frac{1}{2x_i} =a.$$ From here it is easy to get $f_i(x_i)=A_ie^{a x_i^2}$, from which the conclusion follows.
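To spell out that last step (a quick sketch, assuming for simplicity each $f_i>0$; this is harmless, since $f_i$ is continuous and never zero, hence of constant sign): the equation says $(\ln f_i)'(x_i)=2a\,x_i$, so integrating,

$$\ln f_i(x_i)=a\,x_i^2+c_i,\qquad f_i(x_i)=A_i e^{a x_i^2}\quad(A_i=e^{c_i}),$$

and multiplying over $i$,

$$f(x_1,\ldots,x_n)=A_1\cdots A_n\,e^{a(x_1^2+\cdots+x_n^2)},$$

i.e. $\phi(t)=A e^{at}$ with $A=A_1\cdots A_n$, which is the claimed Gaussian form.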

Solution 2:

Maxwell's theorem was originally proved to describe the distribution of particle velocities, so let us use physical intuition to explain it. For example, we can explain why this distribution is exponential as follows:

The distribution $f(E)$ of the kinetic energy depends on the distributions $\phi(E_x),\phi(E_y),\phi(E_z)$ of the kinetic energies of the velocity components, where $E_x=\frac{m v_x^2}{2}$, $E_y=\frac{m v_y^2}{2}$, $E_z=\frac{m v_z^2}{2}$ and $E=E_x+E_y+E_z$. But these components are independent, so $$ f(E_x+E_y+E_z)=\phi(E_x)\phi(E_y)\phi(E_z). $$

Consider all particles with fixed kinetic energy $E$ and fixed velocity component $v_z$. These particles can have different velocity components $v_x,v_y$, but since $E$ is fixed we have $$ E_x+E_y=\text{const}\tag{1} $$ $$ \phi(E_x)\phi(E_y)=\text{const} $$ The last statement can be rewritten as $$ \ln\phi(E_x)+\ln\phi(E_y)=\text{const}\tag{2} $$ Since we are using a physical approach, we can carelessly take differentials of (1) and (2): $$ dE_x+dE_y=0 $$ $$ \frac{\phi'(E_x)}{\phi(E_x)}dE_x+\frac{\phi'(E_y)}{\phi(E_y)}dE_y=0 $$ Then we get $$ \frac{\phi'(E_x)}{\phi(E_x)}=\frac{\phi'(E_y)}{\phi(E_y)} $$ The left-hand side is a function of $E_x$, the right-hand side is a function of $E_y$, so both must be constant: $$ \frac{\phi'(E_x)}{\phi(E_x)}=\alpha_x=\text{const} $$ The solution of this differential equation is $$ \phi(E_x)=A_x e^{\alpha_x E_x} $$ Similarly, for the other components, $$ \phi(E_y)=A_y e^{\alpha_y E_y} $$ $$ \phi(E_z)=A_z e^{\alpha_z E_z} $$

Since the distributions of the velocities $v_x$, $v_y$ and $v_z$ must be the same, we have $A_x=A_y=A_z=A$ and $\alpha_x=\alpha_y=\alpha_z=\alpha$. Since the probability of a particle having a very large velocity must be small, $\alpha<0$. Hence $$ f(E_x+E_y+E_z)=\phi(E_x)\phi(E_y)\phi(E_z)=A^3e^{\alpha(E_x+E_y+E_z)} $$ In terms of velocities this can be rewritten as $$ f\left(\frac{m v_x^2}{2}+\frac{m v_y^2}{2}+\frac{m v_z^2}{2}\right)=A^3e^{\alpha\frac{m (v_x^2+v_y^2+v_z^2)}{2}} $$
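As a small addition to this answer: if one reads $\phi$ as the probability density of a single velocity component (say $v_x$), the constant $A$ is fixed by normalization. Using the Gaussian integral $\int_{-\infty}^{\infty}e^{-\beta v^2}\,dv=\sqrt{\pi/\beta}$ with $\beta=m|\alpha|/2$,

$$1=\int_{-\infty}^{\infty}A\,e^{\alpha\frac{m v_x^2}{2}}\,dv_x=A\sqrt{\frac{2\pi}{m|\alpha|}}\quad\Longrightarrow\quad A=\sqrt{\frac{m|\alpha|}{2\pi}},$$

which, with the standard identification $\alpha=-1/(k_BT)$ from kinetic theory, is exactly the one-dimensional Maxwell–Boltzmann factor.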


And now an interesting fact: Maxwell derived his distribution on an entrance exam! The examiner loved to give open problems in theoretical physics, and Maxwell was given the task of deriving the distribution of velocities of particles in an ideal gas. By the end of the exam Maxwell had produced a complete solution of the problem!

Solution 3:

I only demonstrate that if $f$ is zero anywhere, then it is zero everywhere, to complement the accepted answer. So write $f(x_1,\ldots ,x_n)=f_1(x_1)\cdots f_n(x_n)= \phi(x_1^2+\cdots+x_n^2)$. I will suppose $\phi$ to be continuous. Assume that there is some $r$ such that $\phi(r^2)=0$.

Set $R= \inf \{ \,r \ge 0 \mid \phi(r^2)=0 \, \}$. Suppose first that $R>0$. By continuity $\phi(R^2)=0$, and $f_i(y) \neq 0$ for every $i$ whenever $|y| < R$. We get the contradiction $$ 0 \neq f_1 \left(\frac{R}{\sqrt 2}\right)f_2\left(\frac{R}{\sqrt 2}\right)f_3(0)\cdots f_n(0) =\phi(R^2)=0.$$
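To see why none of these factors vanish (a step the answer leaves implicit): for $|y|<R$, put $y$ in the $i$-th slot and $0$ in the others to get

$$f_1(0)\cdots f_i(y)\cdots f_n(0)=\phi(y^2)\neq 0,$$

since $y^2<R^2$ and $R$ is the infimum; in particular $f_i(y)\neq 0$ and $f_j(0)\neq 0$, which is exactly what the displayed contradiction uses. (The argument tacitly assumes $n\ge 2$.)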

On the other hand, if $R=0$, then $\phi(0)=0$ by continuity, so $f_1(0)\cdots f_n(0)=0$ and hence $f_i(0)=0$ for some $i$. It follows easily now that $\phi$ is identically zero.
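Spelling out that last step: keep $0$ in the $i$-th slot and let the remaining variables range freely; then

$$\phi\Big(\sum_{j\neq i}x_j^2\Big)=f_1(x_1)\cdots f_i(0)\cdots f_n(x_n)=0,$$

and since $\sum_{j\neq i}x_j^2$ takes every value in $[0,\infty)$ (again for $n\ge 2$), $\phi$ vanishes identically, i.e. $f\equiv 0$.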

Solution 4:

The most general form of Maxwell's theorem:

Let $f:\mathbb{R}^N \to [0 , \infty)$ be measurable, and let $x, y, p, q \in \mathbb{R}^N$.

Solve the functional equation $f(x)\,f(y) = f(p)\,f(q)$ under the conditions $$x+y=p+q,\qquad x^2+y^2=p^2+q^2,$$ where $x^2$ denotes the squared norm $|x|^2$ (physically: conservation of momentum and of kinetic energy in a collision).
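As an aside, the constraint set is easy to describe explicitly: by the parallelogram identity

$$|p|^2+|q|^2=\tfrac12\,|p+q|^2+\tfrac12\,|p-q|^2,$$

the two conditions together say that $p+q=x+y$ and $|p-q|=|x-y|$, i.e. the admissible pairs $(p,q)$ are obtained from $(x,y)$ by keeping the midpoint $\tfrac{x+y}{2}$ fixed and rotating the difference $x-y$.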

Set $\psi (x, y)=f(x)\,f(y)$.

$\psi (x, y)$ is constant whenever $x+y$ and $x^2+y^2$ are constant. Therefore $\psi (x, y)= G(x+y,\, x^2+y^2)$ for some function $G$.

Abusing notation, write $f(x, x^2)$ for $f(x)$, recording the point and its squared norm in separate slots. Then

$$f (x, x^2)\, f (y, y^2)= G (x+y,\, x^2+y^2). $$

Treating the two slots as independent variables,

$$f (u, v^2)\, f (w,z^2)= G (u+w,\, v^2+z^2). $$

If there are $u, v$ such that $f(u, v^2) = 0$, then $G(u+w,\, v^2+z^2)=0$ for all $w, z$. Since $w, z$ are arbitrary, this implies $G(a, b^2)=0$ for all $a, b \in \mathbb{R}^N$, which in turn implies $f(m, n^2) = 0$ for all $m, n \in \mathbb{R}^N$.

Since $\int_{\mathbb{R}^N} f \, dx > 0$ (as is the case when $f$ is a probability density), the result above shows that $f > 0$ everywhere.

Now set $\varphi (x, x^2)=\log f(x, x^2)$ and $F=\log G$, so that

$$\varphi(x, x^2)+\varphi(y, y^2)=F (x+y,\, x^2+y^2).$$

Define

$$\varphi_1(x, x^2)=\varphi(x, x^2)-\varphi(0,0)$$

$$\varphi_1(y, y^2)=\varphi(y, y^2)-\varphi(0,0)$$

$$F_1(x+y,x^2+y^2)=F(x+y,x^2+y^2)-F(0,0)$$

so that $\varphi_1(0,0)=F_1(0,0)=0$. Setting $x=y=0$ in the relation above gives $F(0,0)=2\varphi(0,0)$, so subtracting these constants preserves the equation:

$$\varphi_1(x, x^2) +\varphi_1(y, y^2)=F_1(x+y,\,x^2+y^2)$$

Plug $y=0$ into the equation above to get $\varphi_1(x, x^2)=F_1(x,x^2)$, and similarly $\varphi_1(y, y^2)=F_1(y,y^2)$.

Therefore $\varphi_1(x,x^2) +\varphi_1(y,y^2)=\varphi_1(x+y,x^2+y^2)$

Treating the slots independently once more (first-slot entries $u$ and $0$, second-slot entries $0$ and $v^2$), this gives

$$\varphi_1(u,v^2) =\varphi_1(u,0)+\varphi_1(0,v^2)$$

Let $h(u)=\varphi_1(u,0)$ and $g(t)=\varphi_1(0,t)$ (writing $h$ rather than $f$, which is already taken). Restricting the additive relation to one slot at a time gives

$$h(a+b)=h(a)+h(b),\qquad a,b\in\mathbb{R}^N,$$

$$g(s+t)=g(s)+g(t),\qquad s,t\ge 0.$$

By the solution of the Cauchy functional equation for measurable functions, $h(u)=b\cdot u$ for some $b\in\mathbb{R}^N$ and $g(t)=at$ for some $a\in\mathbb{R}$, hence $\varphi_1(x,x^2)=ax^2+b\cdot x$. Finally, $\varphi(x,x^2)=\varphi_1(x,x^2)+\varphi(0,0)$; set $\varphi(0,0)=c$.

Therefore $\varphi(x,x^2)=ax^2+b\cdot x+c$, and

$$f(x) = e^{ax^2+b\cdot x+c}=A\,e^{ax^2+b\cdot x},\qquad A=e^c.$$
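If one additionally requires $f$ to be rotation-invariant, as in the original question, then $ax^2+b\cdot x$ must depend on $x$ only through $|x|$; comparing $x$ with $-x$ forces $b=0$. If $f$ is also integrable (a probability density), then $a<0$, and we recover the Gaussian

$$f(x)=A\,e^{a|x|^2},\qquad a<0.$$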