How badly-behaved are the derivatives of non-analytic smooth functions?
Solution 1:
[Partial answer]
Let $\epsilon>0$. Pick $c \in (-\epsilon, \epsilon)$ such that $f(c) \neq 0$ and $|c|<1$. Given $n$, Taylor's theorem (not Taylor's series!), together with the fact that $f^{(k)}(0)=0$ for every $k$, implies the existence of $\xi_n$ between $0$ and $c$ such that $$f(c)=\frac{f^{(n)}(\xi_n)}{n!}c^n.$$ Therefore, $$|f^{(n)}(\xi_n)|=\frac{n!|f(c)|}{|c|^n}.$$
Since $|c|<1$, this gives $n!|f(c)|\leq |f^{(n)}(\xi_n)|\leq \sup\{|f^{(n)}(x)| : x \in (-\epsilon,\epsilon)\}$, and since $n!|f(c)|\to\infty$ we have that $$\lim_{n \to \infty}\sup\{|f^{(n)}(x)| : x \in (-\epsilon,\epsilon)\}=+\infty.$$
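To get a feel for the size of this bound, here is a concrete instance (purely illustrative; the standard flat function and the specific constants are assumptions made only for this example): for $f(x)=e^{-1/x^2}$ with $f(0)=0$, $\epsilon=1$ and $c=\tfrac12$, we get $f(c)=e^{-4}$ and $$\sup\{|f^{(n)}(x)| : x \in (-1,1)\}\;\geq\;\frac{n!\,|f(\tfrac12)|}{(\tfrac12)^n}\;=\;n!\,2^{n}e^{-4},$$ so the suprema must blow up at least factorially fast.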
That is all one gets in the general case. If we additionally impose that for every $\epsilon>0$ there exists $c \in (-\epsilon,0)$ such that $f(c) \neq 0$, we can pick such a $c$ with $-1<c<0$, and then again Taylor's theorem implies the existence of $\xi_n$ between $0$ and $c$ such that $$f(c)=\frac{f^{(n)}(\xi_n)}{n!}c^n$$ and then $$f^{(n)}(\xi_n)=\frac{n!f(c)}{c^n} .$$ Supposing without loss of generality that $f(c)>0$, we now have (since $0<c^{2n}<1$) $(2n)!f(c)\leq f^{(2n)}(\xi_{2n})\leq \sup\{f^{(2n)}(x) : x \in (-\epsilon,\epsilon)\}$, and therefore $$\limsup_{n \to \infty} \left(\sup\{f^{(n)}(x) : x \in (-\epsilon,\epsilon)\}\right) =+\infty.$$ Analogously, since $-1<c^{2n+1}<0$, we get $\inf\{f^{(2n+1)}(x) : x \in (-\epsilon,\epsilon)\}\leq f^{(2n+1)}(\xi_{2n+1}) \leq -(2n+1)!f(c)$, and therefore: $$\liminf_{n \to \infty} \left(\inf\{f^{(n)}(x) : x \in (-\epsilon,\epsilon)\}\right)=-\infty.$$
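To keep the signs straight in the even/odd split, here is the computation with the illustrative choice $c=-\tfrac12$ and $f(-\tfrac12)>0$ (an assumption made only for this example): $$f^{(2n)}(\xi_{2n})=\frac{(2n)!\,f(-\tfrac12)}{(-\tfrac12)^{2n}}=(2n)!\,4^{n}f(-\tfrac12)>0,\qquad f^{(2n+1)}(\xi_{2n+1})=\frac{(2n+1)!\,f(-\tfrac12)}{(-\tfrac12)^{2n+1}}=-(2n+1)!\,2^{2n+1}f(-\tfrac12)<0,$$ which is exactly the pattern behind the $\limsup$ and $\liminf$ statements.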
These results certainly have some content which might interest you, but they are not exactly what you want. To be explicit, the following points must be made:
- If you are willing to analyze the absolute values of the derivatives instead of the derivatives themselves, the above gives a complete answer.
- If not, we have the following issues:
  - The above argument only works for functions which have points arbitrarily close to $0$ from the left with non-zero values.
  - We do not prove that the sequence $\sup\{f^{(n)}(x) : x \in (-\epsilon,\epsilon)\}$ goes to infinity, only that its $\limsup$ is infinite (and analogously for the $\liminf$).
Solution 2:
Another partial answer. I suspect this is essentially Aloizio's, but it uses only the MVT and IVT instead of Taylor's theorem. Assume $f$ takes some positive value at some $x_0\in(-\epsilon,0)$ (if $f$ takes some negative value in $(-\epsilon,0)$, a symmetric argument applies); note that $x_0$ is negative. Then by the Mean Value Theorem, there exists $x_1\in(x_0,0)$ such that $f'(x_1)=\frac{1}{x_0}\,f(x_0)<0$, and this implies $\left\lvert f'(x_1)\right\rvert>\frac{1}{\epsilon}\,f(x_0)$.
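Spelling out that first step (nothing new, just the MVT with $f(0)=0$ written out): $$f'(x_1)=\frac{f(0)-f(x_0)}{0-x_0}=\frac{f(x_0)}{x_0}<0,\qquad\text{hence}\qquad\lvert f'(x_1)\rvert=\frac{f(x_0)}{\lvert x_0\rvert}>\frac{f(x_0)}{\epsilon},$$ the last inequality because $\lvert x_0\rvert<\epsilon$.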
Inductively, assume that we have found $x_0<x_1<\cdots<x_{n-1}<x_n<0$, that the sign of $f^{(n)}(x_n)$ is $(-1)^n$, and that $\left\lvert f^{(n)}(x_n)\right\rvert>\frac{1}{\epsilon^n}\,f(x_0)$. The above establishes this for $n=1$ (for $n=0$ it holds with $\geq$ in place of $>$, which is all that is needed), and it models the inductive argument which follows.
Applying the MVT to $f^{(n)}$ on $[x_n,0]$ and using $f^{(n)}(0)=0$, there is $c_{n+1}\in(x_n,0)$ such that $f^{(n+1)}(c_{n+1})=\frac{1}{x_n}\,f^{(n)}(x_n)$. By the Intermediate Value Theorem, since $f^{(n+1)}(0)=0$, on $(c_{n+1},0)$ the derivative $f^{(n+1)}$ takes on all values between $0$ and $f^{(n+1)}(c_{n+1})=\frac{1}{x_n}\,f^{(n)}(x_n)$. In particular, it takes the value $(-1)^{n}\frac{1}{\epsilon^n x_n}f(x_0)$, which has the same sign as $f^{(n+1)}(c_{n+1})$ and is smaller in absolute value. So there is $x_{n+1}\in(c_{n+1},0)$ such that $f^{(n+1)}(x_{n+1})$ has sign $(-1)^{n+1}$ and $\left\lvert f^{(n+1)}(x_{n+1})\right\rvert=\frac{1}{\epsilon^n\lvert x_n\rvert}\,f(x_0)>\frac{1}{\epsilon^{n+1}}\,f(x_0)$.
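For concreteness, here is the step once more in the case $n=1$ (again nothing new, just the argument unpacked): $f'(x_1)$ is negative with $\lvert f'(x_1)\rvert>f(x_0)/\epsilon$, so the MVT (using $f'(0)=0$) gives $c_2\in(x_1,0)$ with $$f''(c_2)=\frac{f'(x_1)}{x_1}=\frac{\lvert f'(x_1)\rvert}{\lvert x_1\rvert}>\frac{f(x_0)}{\epsilon\lvert x_1\rvert}>0,$$ and the IVT then yields $x_2\in(c_2,0)$ with $f''(x_2)=\frac{f(x_0)}{\epsilon\lvert x_1\rvert}>\frac{f(x_0)}{\epsilon^{2}}$, of sign $(-1)^2$ as required.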
This inductively proves that each $f^{(n)}$ takes, somewhere on $(-\epsilon,0)$, a value of sign $(-1)^n$ whose absolute value exceeds $f(x_0)/\epsilon^{n}$, and this bound goes to infinity when $\epsilon<1$ (which we may assume, since the conclusion for small $\epsilon$ implies it for larger $\epsilon$). So, assuming that $f$ takes a positive value somewhere (or a negative value, symmetrically) in $(-\epsilon,0)$, your two conjectures are true.
This means that, for either conjecture to fail, the function would have to be identically zero on $(-\epsilon,0)$.
Now I won't repeat the argument, but on $(0,\epsilon)$ the same exponential growth bound goes through; you only lose the alternation of signs. If $f$ takes a positive value at some $x_0\in(0,\epsilon)$, then the first conjecture holds by this argument, and if it takes a negative value there, then the second conjecture holds. But it remains to show that both conjectures hold simultaneously.
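To make explicit why the alternation is lost (a side remark, not part of the main argument): for $x_0\in(0,\epsilon)$ the analogous MVT step, using $f(0)=0$, gives $$f'(x_1)=\frac{f(x_0)-f(0)}{x_0-0}=\frac{f(x_0)}{x_0}\quad\text{for some }x_1\in(0,x_0),$$ which has the same sign as $f(x_0)$; the signs of the constructed values therefore persist instead of alternating, while the growth estimate $\lvert f^{(n)}(x_n)\rvert>\lvert f(x_0)\rvert/\epsilon^{n}$ survives unchanged.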
If some derivative of $f$ takes a positive (respectively negative) value on $(0,\epsilon)$, then you can pick up the argument there to prove that the first (respectively second) conjecture holds.
So the only smooth functions for which both conjectures might fail to hold together are those that are identically zero on $(-\epsilon,0)$ and have all of their derivatives positive (or all negative) on $(0,\epsilon)$. The simultaneous conjectures therefore reduce to showing that no such function exists.
Does there exist a smooth function on $(-\epsilon,\epsilon)$ such that $f^{(n)}(0)=0$ for all $n$, $f(x)=0$ for all $-\epsilon<x<0$, and $f^{(n)}(x)>0$ for all $n$, for all $0<x<\epsilon$?