Why does the higher-order derivative test work?
I'm an AP Calculus BC student, so all I know about derivatives is the increasing/decreasing/relative max/min relationship with the first derivative (the first derivative test) and concavity (the second derivative test). I showed my teacher that you can create a third derivative test (by showing that if the second derivative equals $0$ at one of its critical numbers and the third derivative is non-zero there, then it must be a point of inflection, since the second derivative graph is either increasing or decreasing through that point). He said it was right and told me to think about higher-order derivatives, e.g. for $x^5$ the fifth derivative is the first one that is non-zero at $x = 0$.
When I googled it, I found the higher-order derivative test. Is there a logical explanation for why this test works? I don't understand why, if the first $n-1$ derivatives equal $0$ and the $n$-th is non-zero, you can find maxima, minima, and inflection points.
One way to understand how the test works is by looking at the Taylor Series of the function $f(x)$ centered around the critical point, $x = c$:
$$ f(x) = f(c) + f'(c)(x-c) + \frac{f''(c)}{2}(x-c)^2 + \cdots $$
Note: In your question you said that the $n$-th derivative is non-zero. Here I'm assuming the $(n+1)$-st derivative is the first one that is non-zero at $x = c$. It doesn't make a difference; it's just the way I learned it.
If $f'(c) = \cdots = f^{(n)}(c) = 0$ and $f^{(n+1)}(c) \ne 0$, then the Taylor series ends up looking like this:
$$ f(x) = f(c) + \frac{f^{(n+1)}(c)}{(n+1)!}(x-c)^{n+1} + \frac{f^{(n+2)}(c)}{(n+2)!}(x-c)^{n+2} + \cdots $$
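As a concrete sanity check (my own illustrative example, not from the original question), you can use sympy to build a function whose first few derivatives vanish at the critical point and confirm that the Taylor series really does begin with the first non-vanishing derivative's term:

```python
import sympy as sp

x = sp.symbols('x')
# Example function (chosen for illustration): at c = 0 the first three
# derivatives vanish and the fourth is the first non-zero one,
# i.e. n = 3 and n + 1 = 4 in the notation above.
f = sp.cos(x) - 1 + x**2 / 2

# Derivatives of order 1 through 4 evaluated at the critical point c = 0
print([sp.diff(f, x, k).subs(x, 0) for k in range(1, 5)])  # [0, 0, 0, 1]

# Taylor expansion about 0: the series starts with the (n+1)-st power term
print(sp.series(f, x, 0, 8))  # x**4/24 - x**6/720 + O(x**8)
```

The series begins with $\frac{f^{(4)}(0)}{4!}x^4 = \frac{x^4}{24}$, exactly as the formula above predicts.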
Consider what happens when you move $f(c)$ to the other side of the equation:
$$ f(x) - f(c) = \frac{f^{(n+1)}(c)}{(n+1)!}(x-c)^{n+1} + \frac{f^{(n+2)}(c)}{(n+2)!}(x-c)^{n+2} + \cdots $$
What does $f(x) - f(c)$ mean?
- If $f(x) - f(c) = 0$, then $f(x)$ has the same value as it does at $x = c$.
- If $f(x) - f(c) < 0$, then $f(x)$ has a value less than it has at $x = c$.
- If $f(x) - f(c) > 0$, then $f(x)$ has a value greater than it has at $x = c$.
We expect $f(x) - f(c) = 0$ at $x = c$ (the equation reflects this), but we're more interested in what it does on either side of $x = c$. When $x$ is really close to $c$, i.e. $(x-c)$ is a really small number, we can say:
$$ f(x) - f(c) \approx \frac{f^{(n+1)}(c)}{(n+1)!}(x-c)^{n+1} $$
because the higher powers of a small number "don't matter" as much.
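To see numerically why the higher powers "don't matter" near $c$ (again a sketch, reusing my example function from before), compare $f(x) - f(c)$ with the leading term as $x - c$ shrinks:

```python
import sympy as sp

x = sp.symbols('x')
f = sp.cos(x) - 1 + x**2 / 2            # same example function; c = 0, f(c) = 0
leading = x**4 / sp.factorial(4)        # f''''(0)/4! * (x - 0)**4

for h in [0.5, 0.1, 0.01]:
    exact = float(f.subs(x, h))         # f(x) - f(c)
    approx = float(leading.subs(x, h))  # leading Taylor term
    print(h, exact, approx, exact / approx)
```

The ratio approaches $1$ as $x \to c$, so sufficiently close to $c$ the leading term alone decides the sign of $f(x) - f(c)$.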
Concerning local extrema
If $n$ is odd, then our approximation of $f(x) - f(c)$ is an even power of $(x - c)$. That means $f(x)$ behaves the same way on both sides of $x = c$: it is either greater than $f(c)$ on both sides or less than $f(c)$ on both sides, so $x = c$ is a local extremum. If $f^{(n+1)}(c) > 0$, then $f(x)$ is greater than $f(c)$ on both sides of $x = c$ (a local minimum); if $f^{(n+1)}(c) < 0$, then $f(x)$ is less than $f(c)$ on both sides of $x = c$ (a local maximum).
If, on the other hand, $n$ is even, then our approximation of $f(x) - f(c)$ is an odd power of $(x - c)$. Therefore $f(x)$ will be greater than $f(c)$ on one side of $x = c$ and less on the other. That means $x = c$ isn't a local extremum.
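Here is that rule written out as a short sympy sketch (my own code; the helper name `higher_order_test` is made up), tried on $x^4$, $-x^4$, $x^3$ and $x^5$ at $c = 0$:

```python
import sympy as sp

x = sp.symbols('x')

def higher_order_test(f, c):
    """Sketch of the test: find the order k of the first non-vanishing
    derivative at x = c and classify the critical point by k's parity.
    (Here k = n + 1 in the notation of the answer above.)"""
    k = 1
    while sp.diff(f, x, k).subs(x, c) == 0:
        k += 1
    value = sp.diff(f, x, k).subs(x, c)
    if k % 2 == 0:               # first non-zero derivative has even order
        return "local min" if value > 0 else "local max"
    return "not an extremum"     # odd order: sign changes across c

print(higher_order_test(x**4, 0))   # local min (4th derivative is 24 > 0)
print(higher_order_test(-x**4, 0))  # local max
print(higher_order_test(x**3, 0))   # not an extremum
print(higher_order_test(x**5, 0))   # not an extremum
```

Note that "$k$ even" here is the same condition as "$n$ odd" above, since $k = n + 1$.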
Concerning saddle points
Note that if you differentiate both sides of our approximation twice, you get:
$$ f''(x) \approx \frac{f^{(n+1)}(c)}{(n-1)!}(x-c)^{n-1} $$
If $n$ is even, this is another odd power of $(x - c)$. It therefore has opposite signs on each side of $x = c$, so the concavity changes there and you get a saddle point: a point of inflection with a horizontal tangent.
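For a quick check (again my own example), take $f(x) = x^5$, where $n = 4$: the second derivative $20x^3$ is an odd power of $x$ and flips sign at $0$:

```python
import sympy as sp

x = sp.symbols('x')
f = x**5                # n = 4: f', ..., f'''' all vanish at 0, f^(5)(0) = 120
f2 = sp.diff(f, x, 2)   # 20*x**3, an odd power centered at x = 0

# Opposite signs on either side of 0, so the concavity flips there:
print(f2.subs(x, -0.1), f2.subs(x, 0.1))   # -0.0200...  0.0200...
```

So $x = 0$ is a critical point where the concavity changes: a saddle point (horizontal point of inflection), not a local extremum.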