Why use the derivative and not the symmetric derivative?

The symmetric derivative $\lim\limits_{h\to0}\frac{f(x+h)-f(x-h)}{2h}$ is equal to the ordinary derivative whenever the latter exists, and still isn't defined at jump discontinuities. From what I can tell, the only differences are that the symmetric derivative gives the 'expected slope' at removable discontinuities, and the average of the one-sided slopes at cusps. These seem like extremely reasonable quantities to work with (especially the former), so I'm wondering why the 'typical' derivative isn't taken to be this one. What advantage is there to taking $\lim\limits_{h\to0}\frac{f(x+h)-f(x)}h$ as the main quantity of interest instead? Why would we want to use the one that's defined less often?
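To make the contrast concrete, here is a small Python sketch (my own illustration, not part of the question) comparing the two difference quotients at the cusp of $f(x)=|x|$: the one-sided quotients disagree, while the symmetric quotient returns their average.

```python
def ordinary_quotient(f, x, h):
    # Ordinary difference quotient: (f(x+h) - f(x)) / h
    return (f(x + h) - f(x)) / h

def symmetric_quotient(f, x, h):
    # Symmetric difference quotient: (f(x+h) - f(x-h)) / (2h)
    return (f(x + h) - f(x - h)) / (2 * h)

f = abs  # f(x) = |x| has a cusp at x = 0

for h in [0.1, 0.01, 0.001]:
    print(ordinary_quotient(f, 0.0, h),    # -> 1.0  (slope from the right)
          ordinary_quotient(f, 0.0, -h),   # -> -1.0 (slope from the left)
          symmetric_quotient(f, 0.0, h))   # -> 0.0  (the 'average slope')
```

So the symmetric quotient converges to $0$ at the cusp even though no ordinary derivative exists there.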


Solution 1:

The symmetric derivative being defined at more places isn't a good thing.

In my mind, the main point of differentiation is to locally approximate a function by a linear function. That is, the heart of saying that the derivative $f'(a)$ exists at a point $a$ is the statement that

$$f(x) = f(a) + f'(a) (x - a) + o(|x - a|)$$

as $x \to a$, and if I were the King of Calculus this is how the derivative would actually be defined. (Among other things, this definition generalizes smoothly to higher dimensions.) Removable discontinuities are a non-issue as they should just be removed, but at a cusp we do not have this property for any possible value of $f'(a)$, so we shouldn't be talking about derivatives at such points at all. (We can talk about left or right derivatives, but this is something different.)

The symmetric derivative at $a$ is not a natural definition. It has the utterly strange property that any weirdness in a neighborhood of $a$ is ignored if it happens to be canceled by equivalent weirdness after reflecting around $a$. Let me give an example. Consider the function $f(x) = 1_{\mathbb{Q}}(x)$ which is equal to $1$ if $x$ is rational and $0$ otherwise. If $a$ is rational, then $a+h$ and $a-h$ are rational for exactly the same values of $h$, so the numerator $f(a+h) - f(a-h)$ is identically $0$: the symmetric derivative of $f$ at any rational point exists and is equal to $0$! Is there any reasonable sense in which $f$ is differentiable at a rational point?

The ordinary derivative, on the other hand, is sensitive to weirdness around $a$ because it compares all of that weirdness to $f(a)$.

Solution 2:

Following up on my comment about the Mean Value Theorem: since the MVT fails for the symmetric derivative, anything we prove from the MVT is likely to fail as well. For example:

Find the minimum of the (symmetrically) differentiable function $f(x) = x+2|x|$ on the interval $[-1,1]$.
Usual solution: find where the derivative is zero. Answer: nowhere! The symmetric derivative is $f'(x) = -1$ on $[-1,0)$, $f'(0)=1$, and $f'(x)=3$ on $(0,1]$, so it is never zero, even though the minimum is attained at $x=0$.
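This failure is easy to verify numerically; the following sketch (my own, not from the answer) approximates the symmetric derivative of $f(x) = x + 2|x|$ at a few points and locates the minimum on a grid over $[-1,1]$.

```python
def f(x):
    return x + 2 * abs(x)

def sym_d(x, h=1e-6):
    # Symmetric difference quotient (f(x+h) - f(x-h)) / (2h)
    return (f(x + h) - f(x - h)) / (2 * h)

# The symmetric derivative is -1 on [-1,0), 1 at 0, and 3 on (0,1]:
print(round(sym_d(-0.5)), round(sym_d(0.0)), round(sym_d(0.5)))  # -1 1 3

# ...yet the minimum on [-1, 1] is attained at x = 0, where the
# symmetric derivative is 1, not 0:
xs = [i / 1000 for i in range(-1000, 1001)]
x_min = min(xs, key=f)
print(x_min, f(x_min))  # 0.0 0.0
```

So "set the derivative to zero" finds nothing, precisely because the first-derivative test is a consequence of ordinary differentiability, which $f$ lacks at $0$.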