avoiding calculus

Some people may carelessly say that you need calculus to find such a thing as a local maximum of $f(x) = x^3 - 20x^2 + 96x$. Certainly calculus is sufficient, but whether it's necessary is another question.

There's a global maximum if you restrict the domain to $[0,8]$, and $f$ is $0$ at the endpoints and positive between them. Say the maximum is at $x_0$. One would have $$ \frac{f(x)-f(x_0)}{x-x_0}\begin{cases} >0 & \text{if }x<x_0, \\ <0 & \text{if }x>x_0. \end{cases} $$ This difference quotient is undefined when $x=x_0$, but mere algebra tells us that the numerator factors and we get $$ \frac{(x-x_0)g(x)}{x-x_0} = g(x) $$ where $g(x)$ is a polynomial whose coefficients depend on $x_0$. Then of course one seeks its zeros, since it should change sign at $x_0$.
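For concreteness, carrying out that factoring for this particular $f$ (routine polynomial algebra, so worth double-checking): $$f(x)-f(x_0)=(x^3-x_0^3)-20(x^2-x_0^2)+96(x-x_0)=(x-x_0)\bigl(x^2+x_0x+x_0^2-20(x+x_0)+96\bigr),$$ so that $$g(x)=x^2+(x_0-20)x+\bigl(x_0^2-20x_0+96\bigr),$$ and the condition $g(x_0)=0$ becomes the quadratic equation $3x_0^2-40x_0+96=0$ in $x_0$.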

Have we tacitly used the intermediate value theorem, or the extreme value theorem? To what extent can those be avoided? Must one say that if there is a maximum point, then it is at a zero of $g(x)$? And can we say that without the intermediate value theorem? (At least in the case of this function, I think we stop short of needing the so-called fundamental theorem of algebra to tell us some zeros of $g$ exist!)


There's actually a famous book by Ivan Niven on precisely this subject, i.e. finding extrema by purely algebraic or geometric methods. It's called, simply enough, Maxima and Minima without Calculus. It's currently published by the Mathematical Association of America. I think you'll find it fascinating and full of problems very much like this.


We do an analysis using the Arithmetic Mean-Geometric Mean (AM-GM) Inequality.
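For reference, the case $n=3$ that gets used below: for nonnegative reals $u,v,w$, $$uvw\le\left(\frac{u+v+w}{3}\right)^3,$$ with equality precisely when $u=v=w$.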

Make the usual change of variable $x=t+\frac{20}{3}$ to eliminate the term in the square of the variable. We get the cubic polynomial $$t^3-\frac{112}{3}t+k,$$ where I didn't bother to calculate $k$. This is just the original cubic shifted horizontally.
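Spelling out that substitution (my own arithmetic; only the coefficient of $t$ matters for what follows): the $t^2$ terms cancel, since $(t+\tfrac{20}{3})^3$ contributes $20t^2$ and $-20(t+\tfrac{20}{3})^2$ contributes $-20t^2$, while the coefficient of $t$ is $$3\left(\frac{20}{3}\right)^2-20\cdot 2\cdot\frac{20}{3}+96=\frac{400}{3}-\frac{800}{3}+\frac{288}{3}=-\frac{112}{3}.$$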

So, since the constant $k$ only shifts the graph vertically and does not change where the extrema occur, we want to study $h(t)=t^3-\frac{112}{3}t$, or more generally $h(t)=t^3-at$, where $a\gt 0$. This is an odd function, so from now on assume that $t\gt 0$, and let symmetry across the origin take care of the rest.

Let us maximize $2(h(t))^2$. We have $$2(h(t))^2=(2t^2)(t^2-a)(t^2-a)=(2t^2)(a-t^2)(a-t^2).$$ For $0\lt t\lt \sqrt{a}$ this is a product of $3$ positive terms, whose sum has the constant value $2a$. It follows by the case $n=3$ of AM-GM that $$2(h(t))^2 \le \left(\frac{2a}{3}\right)^3,$$ with equality precisely if all the terms of the product are equal. This happens when $2t^2=a-t^2$, that is when $$t=\pm\sqrt{\frac{a}{3}}$$ (the negative value coming from the odd symmetry). One $t$ will give you the maximum and the other the minimum. Don't forget to add $\frac{20}{3}$ to $t$ to get $x$.
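Concretely, plugging in $a=\frac{112}{3}$ (my own arithmetic, included only as a sanity check): $$t=\pm\sqrt{\frac{a}{3}}=\pm\sqrt{\frac{112}{9}}=\pm\frac{4\sqrt{7}}{3},\qquad x=t+\frac{20}{3}=\frac{20\pm 4\sqrt{7}}{3}.$$ Since $h$ is negative on $(0,\sqrt{a})$, the positive $t$ gives the local minimum, and by the odd symmetry the negative $t$ gives the local maximum, so the local maximum of the original $f$ is at $x=\frac{20-4\sqrt{7}}{3}\approx 3.14$, comfortably inside the interval $(0,8)$ from the question.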


There is a kind of algebraic structure called a real closed field. It's a structure with constants $0$ and $1$, operations $+$ and $\cdot$, and a relation $<$.

The theory of real closed fields is interesting, because any (first-order) logical statement you can make using just $0,1,+,\cdot,<$ is true in a single real closed field if and only if it is true in all real closed fields. Similarly, the ones that are true can be proven using just first-order logic and the axioms of real closed fields -- i.e. each such statement has a purely algebraic proof.

For example, for any particular algebraic function, the statement that it has a local maximum at a particular point can be expressed in first-order logic. Thus, if you can find a local maximum with calculus, you can find it with pure algebra too.
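For instance, in the simplest case of a polynomial $p$ whose coefficients are spelled out in the formula (just one of several ways to write it), "$p$ has a local maximum at $c$" becomes the first-order formula (with $c$ as a parameter) $$\exists\delta\,\Bigl(\delta>0\;\wedge\;\forall x\,\bigl(c-\delta<x\;\wedge\;x<c+\delta\;\rightarrow\;p(x)\le p(c)\bigr)\Bigr),$$ where $p(x)\le p(c)$ unwinds into an inequality built only from $0,1,+,\cdot,<$ (and equality) once the polynomial is written out term by term.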

Of course, the axioms of a real closed field typically include some variant of the intermediate value theorem, e.g.

every polynomial of odd degree has a root

This is a form of the IVT, since a polynomial of odd degree must be positive somewhere and negative somewhere. Of course, it can also be seen as a variant of the fundamental theorem of algebra. Ultimately, in the purely algebraic setting, there isn't much difference between the two.
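To make the "first-order" point concrete: in one common axiomatization, the degree-$3$ instance of that axiom, which is all this particular question needs, is just the sentence $$\forall a\,\forall b\,\forall c\;\exists x\;\bigl(x^3+ax^2+bx+c=0\bigr),$$ and one such sentence is included for each odd degree.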