The arithmetic-geometric mean (AM-GM) inequality states that, for nonnegative real numbers $x_1, \ldots, x_n$, $$\frac{x_1+ \ldots + x_n}{n} \geq \sqrt[n]{x_1 \cdots x_n}$$ I'm looking for some original proofs of this inequality. I can find the usual proofs on the internet, but I was wondering if someone knew a proof that is unexpected in some way. For example: can you link the theorem to some other famous theorem, can you find a non-trivial geometric proof (I can find some of those), or a proof that uses theory which at first sight seems unrelated to this inequality (e.g. differential equations …)?

Induction, backward induction, use of Jensen's inequality, swapping terms, use of Lagrange multipliers, a proof using thermodynamics (yeah, I know, that's more a physical argument that the theorem might be true than a real proof), convexity, … are some of the proofs I know.


Solution 1:

This is a fairly old answer of mine with a proof that was not very motivated, so I thought I'd update it with some geometric intuition about convexity.

Consider for simplicity the two-variable case $(a+b)/2 \ge \sqrt{ab}$ and fix, say, $a = 1$. The plots of $(1+b)/2$ and $\sqrt{b}$ show intuitively how the concavity of the geometric mean forces it to lie below the arithmetic mean, with equality at exactly one point, $b = 1$. This concavity extends to any number of variables, but obviously a plot is not a proof.

[Plot: $(1+b)/2$ and $\sqrt{b}$ against $b$; the line stays above the curve, touching it at $b = 1$]
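If you want to regenerate the figure, here is a minimal sketch, assuming NumPy and matplotlib are available (the exact styling of the original plot is not preserved):

```python
# Sketch: plot the arithmetic mean (1+b)/2 against the geometric mean sqrt(b).
import numpy as np
import matplotlib.pyplot as plt

b = np.linspace(0, 4, 400)
plt.plot(b, (1 + b) / 2, label=r"$(1+b)/2$")   # arithmetic mean, a line
plt.plot(b, np.sqrt(b), label=r"$\sqrt{b}$")   # geometric mean, concave
plt.axvline(1, linestyle="--", color="gray")   # equality at b = 1
plt.xlabel("b")
plt.legend()
plt.show()
```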

The proof presented here for more than two variables requires only elementary properties of logarithms, which turn multiplication into addition. Start from the inequality:

$$ \frac{x_1 + \dots + x_n}{n}\ge (x_1 \dots x_n)^{1/n} $$

Taking logs preserves the inequality since $\log$ is an increasing function:

$$\iff \log \left(\frac{x_1 + \dots + x_n}{n}\right) \ge \frac 1 n \log (x_1 \dots x_n) = \frac{\log x_1 + \dots + \log x_n}{n}$$

$\DeclareMathOperator{\E}{E}$ If we write $\E[X]$ for the mean of the $x_i$'s and $\E[\log(X)]$ for the mean of the $\log x_i$'s (that is, $X$ is uniform on $\{x_1, \dots, x_n\}$), we can restate this in the language of expectation:

$$\log(\E[X]) \ge \E[\log (X)]$$

Since $\log$ is concave, this is exactly Jensen's inequality (which can be proved inductively from the definition of convexity), so the inequality holds.
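As a quick numerical sanity check of the expectation form (an illustration, not a proof), assuming NumPy:

```python
# Check log(E[X]) >= E[log X] on random positive samples.
import numpy as np

rng = np.random.default_rng(0)
for _ in range(5):
    x = rng.uniform(0.1, 10.0, size=8)  # arbitrary positive values
    lhs = np.log(x.mean())              # log of the arithmetic mean
    rhs = np.log(x).mean()              # mean of logs = log of the geometric mean
    assert lhs >= rhs
    print(f"log(E[X]) = {lhs:.4f} >= E[log X] = {rhs:.4f}")
```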


Original answer: Pólya's proof, which uses the convexity of $e^x$ in a similar way:

Let $f(x) = e^{x-1}-x$. The first derivative is $f'(x)=e^{x-1}-1$ and the second derivative is $f''(x) = e^{x-1}$.

Since $f''(x) > 0$, $f$ is convex everywhere, and $f'(x) = 0$ exactly at $x = 1$, so $f$ attains its global minimum there: $f(1) = e^0 - 1 = 0$. Therefore $f(x) \ge 0$, i.e., $x \le e^{x-1}$ for all $x$, with equality only when $x = 1$.

Using this inequality we get

$$\frac{x_1}{a} \frac{x_2}{a} \cdots \frac{x_n}{a} \le e^{\frac{x_1}{a}-1} e^{\frac{x_2}{a}-1} \cdots e^{\frac{x_n}{a}-1}$$

where $a = \frac{x_1 + x_2 + \cdots + x_n}{n}$ is the arithmetic mean. The right side simplifies to

$$\exp \left(\frac{x_1}{a} -1 \ +\frac{x_2}{a} -1 \ + \cdots + \frac{x_n}{a} -1 \right)$$

$$=\exp \left(\frac{x_1 + x_2 + \cdots + x_n}{a} - n \right) = \exp(n - n) = e^0 = 1,$$

since $x_1 + x_2 + \cdots + x_n = na$ by the definition of $a$.

Going back to the first inequality, we obtain

$$\frac{x_1x_2\cdots x_n}{a^n} \le 1$$

So we end with

$$\sqrt[n]{x_1x_2\cdots x_n} \le a$$
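The whole chain can be traced numerically; a minimal sketch, assuming NumPy (an illustration only, the proof above is complete on its own):

```python
# Trace Pólya's argument: each x_i/a <= e^{x_i/a - 1}, the product of the
# right-hand sides collapses to exp(n - n) = 1, so the geometric mean <= a.
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(0.5, 5.0, size=6)
a = x.mean()                            # arithmetic mean
r = x / a
assert np.all(r <= np.exp(r - 1))       # termwise x <= e^{x-1}
print("product of bounds:", np.exp(r - 1).prod())     # ~ 1.0
print("geometric mean   :", x.prod() ** (1 / len(x)))
print("arithmetic mean  :", a)
```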

Solution 2:

I shall provide a simple geometric proof of the inequality in the case of two variables, one which I have not been able to find anywhere else (a proof involving a triangle inscribed in a circle seems to be the popular one).

Consider the square of side $a + b$ in the figure below.
[Figure: a square of side $a + b$ dissected into four $a \times b$ rectangles arranged around a central square of side $b - a$]

The area of the square is $(a + b)^2$. But as it completely contains the four blue rectangles, each of area $ab$, it follows that

$$(a + b)^2 \ge 4ab \;\Longrightarrow\; \frac{a + b}{2} \ge \sqrt{ab}$$

Further, note that there is a square in the middle, of side $b - a$ (taking $b \ge a$ without loss of generality) and hence of area $(b - a)^2$, which accounts exactly for the leftover area. Therefore the inequality is strict except when $a = b$.
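The picture encodes the algebraic identity $(a+b)^2 = 4ab + (b-a)^2$; a one-line symbolic check, assuming SymPy is available:

```python
# Verify the dissection identity behind the figure: (a+b)^2 - 4ab - (b-a)^2 = 0.
import sympy as sp

a, b = sp.symbols("a b", nonnegative=True)
print(sp.expand((a + b)**2 - 4*a*b - (b - a)**2))  # prints 0
```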

This proves the two-variable case. In principle the same idea extends to the $n$-variable case. I have tried extending it to three variables, but it is difficult to argue why exactly $27$ rectangular parallelepipeds (of sides $a$, $b$, $c$) fit inside the cube (of side $a + b + c$), though I can see that it is so. Any suggestions?
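For what it's worth, the volume version of the three-variable claim, $27abc \le (a+b+c)^3$, is easy to probe numerically (this says nothing about the actual packing, of course); a minimal sketch, assuming NumPy:

```python
# Random check of 27abc <= (a+b+c)^3, i.e. AM-GM for n = 3.
import numpy as np

rng = np.random.default_rng(2)
for _ in range(5):
    a, b, c = rng.uniform(0.1, 10.0, size=3)
    assert 27 * a * b * c <= (a + b + c) ** 3
print("27abc <= (a+b+c)^3 held on all samples")
```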