Can a function be smooth at a single point?
I saw a thread (Find a function smooth at one isolated point) in which it is asked whether it is possible for a function to be smooth at a point but not smooth on any deleted neighbourhood of that point. The thread was closed with an accepted answer even though that answer seems to be incorrect, so I would like to re-ask the question here.
I have read (here: Example of a function continuous at only one point.) that $$f=\begin{cases}x & \text{if } x\in\mathbb{Q}\\ 0 & \text{if } x\in\mathbb{R}\setminus\mathbb{Q} \end{cases}$$ is continuous only at $x=0$. If it is continuous at $x=0$, then maybe $$f=\begin{cases}\sum_{k=1}^{\infty}x^k & \text{if } x\in\mathbb{Q}\\ 0 & \text{if } x\in\mathbb{R}\setminus\mathbb{Q} \end{cases}$$ is smooth at $x=0$ and nowhere else (does this work?).
Solution 1:
The Weierstrass function is continuous everywhere and nowhere differentiable (see this proof). Given parameters $0<a<1$ and $b$ an odd integer such that $ab>1+\frac{3}{2}\pi$, this is $$w(x)=\sum_{n=0}^{\infty}a^n\cos(b^n\pi x)$$
Now note that this can be integrated to yield an everywhere $C^1$ nowhere $C^2$ function: $$W(x)=\sum_{n=0}^{\infty}\frac{1}{\pi}\left(\frac{a}{b}\right)^n\sin(b^n\pi x)$$
This is extremely similar to the original function, except that the parameter $a$ has been replaced by the smaller value $\frac{a}{b}$, which regularizes the sum. Repeating the integration clearly constructs everywhere $C^k$ nowhere $C^{k+1}$ functions.
To pass to smoothness at one point only, we need a way to interpolate between values of $a$ depending on $x$... so let us make the parameter $x$ itself, replacing $a^n$ by $\left|x\right|^n$.
Try (partial plot here) $$f(x)=\sum_{n=0}^{\infty}\cos(b^n\pi x)\left|x\right|^n$$ which has good reason to locally be $C^n$ for all $n$ in neighbourhoods of $0$ of size approximately $b^{-n}$.
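For readers who would like to see the partial sums, here is a minimal plotting sketch (my own, not part of the original answer); the truncation level $N=30$ and the choice $b=7$ (the value used later in the argument) are illustrative assumptions.

```python
# Plot a truncated version of f(x) = sum_n cos(b^n * pi * x) * |x|^n.
# Truncation at N = 30 and b = 7 are my illustrative choices.
import numpy as np
import matplotlib.pyplot as plt

b, N = 7, 30
x = np.linspace(-1, 1, 20001)

f = np.zeros_like(x)
for n in range(N):
    f += np.cos(float(b**n) * np.pi * x) * np.abs(x)**n

plt.plot(x, f, linewidth=0.5)
plt.title(r"Partial sum of $\sum_n \cos(b^n\pi x)\,|x|^n$ with $b=7$, $N=30$")
plt.xlabel("x")
plt.ylabel("partial sum")
plt.show()
```

Near $x=0$ the graph looks flat and regular, while for $\left|x\right|$ close to $1$ the familiar Weierstrass-type roughness appears.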
Following the original proof idea in (1), let's try using the Weierstrass M-test for uniform convergence of the $k$th termwise derivative of this sum. Using the product rule, the $k$th derivative of the $n$th term looks like $$f_n^{(k)}=\sum_{q=0}^{\min(n,k)}\binom{k}{q}\cos\left(b^n\pi x+\frac{(k-q)\pi}{2}\right)\left(b^n\pi\right)^{k-q}\frac{n!}{(n-q)!}\mathrm{sgn}(x)^q\left|x\right|^{n-q}$$
For sufficiently small magnitude of $x$ and for $n\geq1$, a crude upper bound on the magnitude of the term is $$\left|f_n^{(k)}\right|<\sum_{q=0}^k\binom{k}{q}\left(b^n\pi\right)^{k-q}n^q\left|x\right|^{n-q}<2^k\left(b^n\pi\right)^kn^k\left|x\right|^{n-k}=\left(b^k\left|x\right|\right)^n\left(\frac{2\pi n}{\left|x\right|}\right)^k$$ and recall that $k$ is a constant here (the single $n=0$ term is smooth and does not affect convergence). Since the polynomial factor $n^k$ is dominated by the geometric factor $\left(b^k\left|x\right|\right)^n$, the series of termwise $k$th derivatives converges locally uniformly, by the Weierstrass M-test, wherever $b^k\left|x\right|<1$.
Therefore in the neighbourhood $\left(-b^{-k},b^{-k}\right)$, the function $f(x)$ is $C^k$.
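As a quick numerical sanity check (my own sketch, not part of the answer), one can evaluate the Leibniz-formula derivative of each term and compare it with the bound above; the sample values $b=7$, $k=2$, $x_0=0.01$ satisfy $b^k\left|x_0\right|=0.49<1$ and are mine.

```python
# Check |f_n^(k)(x0)| <= (b^k |x0|)^n * (2 pi n / |x0|)^k and sum the bounds.
# Sample values b = 7, k = 2, x0 = 0.01 (so b^k * |x0| = 0.49 < 1) are mine.
import math

b, k, x0 = 7, 2, 0.01

def term_kth_derivative(n, k, x):
    """k-th derivative of cos(b^n * pi * x) * |x|^n via the Leibniz formula (x != 0)."""
    total = 0.0
    for q in range(min(n, k) + 1):
        total += (math.comb(k, q)
                  * math.cos(b**n * math.pi * x + (k - q) * math.pi / 2)
                  * float(b**n * math.pi) ** (k - q)
                  * math.perm(n, q)                      # n!/(n-q)!
                  * math.copysign(1.0, x) ** q
                  * abs(x) ** (n - q))
    return total

bounds = [(b**k * abs(x0))**n * (2 * math.pi * n / abs(x0))**k for n in range(1, 61)]
worst_ratio = max(abs(term_kth_derivative(n, k, x0)) / bounds[n - 1] for n in range(1, 61))

print("sum of the bounds M_n for n = 1..60:", sum(bounds))   # finite, M-test applies
print("largest |f_n^(k)(x0)| / M_n observed:", worst_ratio)  # stays below 1
```

The printed sum is finite and the ratio stays below $1$, matching the estimate above.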
To prove the other direction, let us pick $b\geq7$ for our purposes. Suppose that $\left|x\right|$ is such that $b^k\left|x\right|<1$ and $b^{k+1}\left|x\right|>1+\frac{3}{2}\pi$, that is $$b^{-k}\left(\frac{1+\frac{3}{2}\pi}{b}\right)<\left|x\right|<b^{-k}$$ and we have chosen $b$ to make this interval nontrivial.
Note that the termwise $k$th derivative double sum is absolutely and uniformly convergent, so basically we can rearrange it as much as we want. In particular, let us split out the "principal" piece where we take the derivative of the $\cos$ all the time: $$\begin{align*}f_n^{(k)}&=\pi^k\cos\left(b^n\pi x+\frac{k\pi}{2}\right)\left(b^k\left|x\right|\right)^n\\ &\hphantom{{}={}}+\sum_{q=1}^{\min(n,k)}\binom{k}{q}\cos\left(b^n\pi x+\frac{(k-q)\pi}{2}\right)\left(b^n\pi\right)^{k-q}\frac{n!}{(n-q)!}\mathrm{sgn}(x)^q\left|x\right|^{n-q}\end{align*}$$ Call the remaining pieces $f^{(k)*}_n$, take the derivative one more time and bound: $$\left|\left(f^{(k)*}_n\right)'\right|<\sum_{q=1}^{k+1}\binom{k+1}{q}\left(b^n\pi\right)^{k+1-q}n^q\left|x\right|^{n-q}<2^{k+1}\left(b^n\pi\right)^kn^{k+1}\left|x\right|^{n-k-1}\leq\left(b^k\left|x\right|\right)^n\left(\frac{2\pi n}{\left|x\right|}\right)^{k+1}$$ and the point is that under our hypothesis $b^k\left|x\right|<1$, so the Weierstrass M-test still applies.
What this means is that $$f^{(k)}=\underbrace{\sum_{n=0}^{\infty}\pi^k\cos\left(b^n\pi x+\frac{k\pi}{2}\right)\left(b^k\left|x\right|\right)^n}_{\text{call this }f^{(k)p}}+\sum_{n=0}^{\infty}f^{(k)*}_n$$ where we have just proven the latter sum is itself continuously differentiable. Therefore $f^{(k)}$ is itself differentiable if and only if the principal part $f^{(k)p}$ is.
Now, to take the derivative at some value $a$, we only care about a neighbourhood of $a$, so let us approximate it with the actual Weierstrass function with parameter $b^k\left|a\right|$: $$f^{(k)p}=\sum_{n=0}^{\infty}\pi^k\cos\left(b^n\pi x+\frac{k\pi}{2}\right)\left(b^k\left|a\right|\right)^n+\sum_{n=0}^{\infty}\underbrace{\pi^k\cos\left(b^n\pi x+\frac{k\pi}{2}\right)b^{kn}\left(\left|x\right|^n-\left|a\right|^n\right)}_{\text{call this }f^{(k)e}_n}$$
We take the derivative of the error termwise using first principles: $$\begin{align*}\left(f^{(k)e}_n\right)'(a)&=\lim_{h\rightarrow0}\frac{f^{(k)e}_n(a+h)-\overbrace{f^{(k)e}_n(a)}^0}{h}\\ &=\left(\lim_{h\rightarrow0}\pi^k\cos\left(b^n\pi(a+h)+\frac{k\pi}{2}\right)b^{kn}\right)\left(\lim_{h\rightarrow0}\frac{\left|a+h\right|^n-\left|a\right|^n}{h}\right)\\ &=\pi^k\cos\left(b^n\pi a+\frac{k\pi}{2}\right)b^{kn}n\;\mathrm{sgn}(a)\left|a\right|^{n-1}\end{align*}$$ and one last time, by the Weierstrass M-test the error terms are continuously differentiable.
Finally, we're left with $$f^{(k)}-(C^1\text{ function})=\sum_{n=0}^{\infty}\pi^k\cos\left(b^n\pi x+\frac{k\pi}{2}\right)\left(b^k\left|a\right|\right)^n$$ which is (up to the constant factor $\pi^k$ and a phase shift) a Weierstrass function with parameters $b^k\left|a\right|$ and $b$; by hypothesis $b^{k+1}\left|a\right|>1+\frac{3}{2}\pi$, so this function is nowhere differentiable. So for $\left|a\right|$ in this range, $f$ is $C^k$ but not $C^{k+1}$ at $a$.
Solution 2:
As I suggested yesterday, I will try to fix the example Paul Sinclair made in this post.
Let $f_0$ be a continuous and nowhere differentiable function. We define $f_{n+1}$ recursively for $n\geq 0$ as follows: $$ f_{n+1}(x) = \begin{cases} f_n(x),& x\leq - \frac{1}{n+1}, \\ K_{n+1}(x) + \int_{-\frac{1}{n+1}}^x f_n(y)dy,& x\in \left( - \frac{1}{n+1}; 0 \right] \end{cases} $$ where $K_{n+1}$ is a polynomial of degree $n$ ensuring that $f_{n+1}$ is $C^n$ on $\left( - \frac{1}{n+1}; 0 \right)$. In fact, $$ K_{n+1}(x)= \sum_{j=0}^n a_j \left( x+ \frac{1}{n+1} \right)^j$$ with $$ a_0 = f_n\left( -\frac{1}{n+1} \right), \qquad a_k = \frac{f_n^{(k)}\left(-\frac{1}{n+1}\right) - f_n^{(k-1)}\left(-\frac{1}{n+1}\right)}{k!} \quad \text{for } 1\leq k \leq n $$ will do the job (we just have to compare the left and the right limits of the derivatives).
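Since the only role of the coefficients $a_k$ is to match the one-sided derivatives at the junction $-\frac{1}{n+1}$, this can be checked symbolically. The sketch below is my own and uses the analytic stand-in $f_0(x)=\cos(\pi x)$ purely so that exact derivatives exist; the actual construction of course needs a nowhere differentiable $f_0$.

```python
# Symbolic check of the gluing conditions for f_{n+1} (my sketch; f0 = cos(pi x)
# is only a stand-in so that derivatives can be computed exactly).
import sympy as sp

x, y = sp.symbols('x y')
f = [sp.cos(sp.pi * x)]            # f[n] holds the branch of f_n valid around -1/(n+1)

for n in range(3):                 # build f_1, f_2, f_3
    c = sp.Rational(-1, n + 1)     # junction point -1/(n+1)
    fn = f[n]
    # coefficients of K_{n+1} exactly as stated above
    a = [fn.subs(x, c)]
    a += [(sp.diff(fn, x, k) - sp.diff(fn, x, k - 1)).subs(x, c) / sp.factorial(k)
          for k in range(1, n + 1)]
    K = sum(a[j] * (x - c) ** j for j in range(n + 1))
    f.append(sp.expand(K + sp.integrate(fn.subs(x, y), (y, c, x))))

# one-sided derivatives agree at each junction up to the expected order
for n in range(3):
    c = sp.Rational(-1, n + 1)
    for k in range(n + 1):
        left = sp.diff(f[n], x, k).subs(x, c)        # from the old branch
        right = sp.diff(f[n + 1], x, k).subs(x, c)   # from the new branch
        assert abs(float(sp.N(left - right))) < 1e-12
print("derivatives match at every junction up to the expected order")
```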
Next, for $n\geq 1$ we define $$ f(x) = f_n(x) \qquad \text{for } x\in \left( - \frac{1}{n}; - \frac{1}{n+1} \right] $$ By construction this yields a function which is $C^n$ on $\left( - \frac{1}{n}; 0 \right)$ and such that $f^{(n)}$ is continuous, but nowhere differentiable on $\left(-\frac{1}{n}; - \frac{1}{n+1} \right]$. Now we extend $f$ to $(-1;1)$ by defining $ f(x) := f(-x) $ for $x>0$ and $f(0):=0$.
Then this $f$ is $C^n$ on $\left( - \frac{1}{n}; \frac{1}{n} \right) \setminus \{ 0\}$ and such that $f^{(n)}$ is continuous, but nowhere differentiable on $\left(-\frac{1}{n}; \frac{1}{n} \right) \setminus \left(-\frac{1}{n+1}; \frac{1}{n+1} \right)$.
This almost yields the desired example. We just need to modify $f$ in such a way that it is smooth at zero. Following the idea by Paul Sinclair we multiply by a smooth function which is positive except at the origin (so that we do not change the regularity away from the origin) and with enough decay so that the product becomes smooth at the origin. For this we need to estimate the derivatives of $f$.
For all the estimates I am going to do, I will denote by $ \Vert \cdot \Vert_{I}$ the supremum norm on the interval $I$.
First we note that $$ \Vert K_{n+1} \Vert_{[-1;0]} \leq 2 \sum_{j=0}^n \left\vert f_n^{(j)} \left( -\frac{1}{n+1}\right) \right\vert \qquad (1)$$ and that for every $k\in \mathbb{N}$ (this we can prove by induction) $$ f_{n+1}^{(k)}(x) = f_{n-k+1}(x) + \sum_{j=0}^{k-1} K_{n-j+1}^{(k-j)}(x) \qquad (2) $$ Combining $(1)$ and $(2)$ we get $$ \Vert f_{n+1}^{(k)} \Vert_{\left[- \frac{1}{n+1};0\right)} \leq \Vert f_{n-k+1} \Vert_{\left[- \frac{1}{n+1};0 \right)} + 2 \sum_{j=0}^{k-1} \sum_{l=0}^{n-j} \Vert f_{n-j}^{(l)} \Vert_{\left[- \frac{1}{n+1};0\right)} \leq 3 \cdot (n+1)^k \max_{\substack{0 \leq s \leq n,\\ 0 \leq t \leq \min\{s,k\}}} \Vert f_s^{(t)} \Vert_{\left[- \frac{1}{n+1};0\right)} $$ Using induction we get $$ \Vert f_{n+1}^{(k)} \Vert_{\left[- \frac{1}{n+1};0 \right)} \leq 3^{n+1} \cdot (n+1)^{(n+1)\cdot k} \cdot \underbrace{\Vert f_0 \Vert_{[-1;0]}}_{=:C} $$
Now we fix $k$. For $n\geq k+1$ we have for $x\in \left[ - \frac{1}{n+1}; - \frac{1}{n+2} \right]$ $$ \vert f^{(k)}(x) \vert = \vert f_{n+1}^{(k)}(x) \vert \leq C \cdot 3^{n+1} \cdot (n+1)^{(n+1)\cdot k}$$ from which we conclude, for all $x\in (-1;1)\setminus \{ 0 \}$ (we get it on the positive axis as well because we extended $f$ by $f(x) = f(-x)$, so all the estimates still hold true), $$ \vert f^{(k)}(x) \vert \leq C \cdot 3^{\frac{1}{\vert x \vert}} \cdot \left( \frac{1}{\vert x \vert}\right)^{\left(\frac{1}{\vert x \vert}\right) \cdot k} $$
Finally we have that $$ g(x) = \begin{cases} \left(\left( \frac{1}{\vert x \vert}\right)^{\left(\frac{1}{\vert x \vert}\right)^{\left(\frac{1}{\vert x \vert}\right)}}\right)^{-1} f(x),& x\neq 0\\ 0,& x=0 \end{cases} $$ is an example of a function which is only smooth at the origin.
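To get a feeling for why this particular tower of exponents beats the derivative bound, one can compare logarithms numerically. The sketch below is mine; $C=1$ and the sample values of $k$ and $t=\frac{1}{\vert x\vert}$ are arbitrary illustrative choices.

```python
# Compare log of the derivative bound C * 3^t * t^(k*t) with log of the damping
# factor t^(t^t), where t = 1/|x|.  C = 1 and the sample values are my choices.
import math

def log_bound(t, k):        # log( 3^t * t^(k*t) )
    return t * math.log(3) + k * t * math.log(t)

def log_damping(t):         # log( t^(t^t) ) = t^t * log(t)
    return t ** t * math.log(t)

for t in [2, 4, 6, 8, 10]:                 # i.e. |x| = 1/2, 1/4, ..., 1/10
    for k in [1, 5, 25]:
        ratio = log_bound(t, k) / log_damping(t)
        print(f"t = {t:2d}, k = {k:2d}:  log(bound)/log(damping) = {ratio:.2e}")
```

For each fixed $k$ the ratio tends to $0$ as $t\to\infty$, i.e. the damping factor eventually crushes the derivative bound; this is the heart of why $g$ can be smooth at the origin (the full argument also has to handle the derivatives of the damping factor itself).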
Solution 3:
I think I have a simpler solution. It doesn't depend on nowhere differentiable functions and seems less computational to me.
For $f\in C^n=C^n(\mathbb R),$ define
$$\|f\|_n = \max_{|k|\le n} \sup_{\mathbb R}|D^k f|.$$
Suppose $a<b$ and $n$ is a positive integer. Define $g$ by setting
$$g(x)=(x-a)^{n+1}(b-x)^{n+1}\,\,\text {for } x\in [a,b],\, g=0 \text { elsewhere}.$$
Then $g\in C^n,$ but $D^{n+1}g(a),$ $D^{n+1}g(b)$ both fail to exist. Furthermore the same is true for $cg,$ where $c$ is any small positive constant. Note $\|cg\|_n = c\|g\|_n.$
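Here is a quick symbolic confirmation of that claim (my own sketch; the sample values $a=0$, $b=1$, $n=3$ are arbitrary):

```python
# Check that g(x) = (x-a)^(n+1) * (b-x)^(n+1) on [a,b], 0 elsewhere, is C^n but
# not C^(n+1) at a.  Sample values a = 0, b = 1, n = 3 are my choices.
import sympy as sp

x = sp.symbols('x')
a, b, n = 0, 1, 3
p = (x - a) ** (n + 1) * (b - x) ** (n + 1)       # the branch of g on [a, b]

for j in range(n + 2):
    right = sp.diff(p, x, j).subs(x, a)           # j-th derivative from the right
    print(f"D^{j} g(a+) = {right}")               # the left branch is identically 0,
                                                  # so all left derivatives vanish
# D^0, ..., D^n vanish (matching the zero branch), while D^(n+1) g(a+) equals
# (n+1)! * (b-a)^(n+1) != 0, so D^(n+1) g(a) cannot exist.
```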
Now choose a sequence $b_1>a_1>b_2>a_2 > \cdots \to 0.$ In view of the above, we can choose $f_n$ such that i) $f_n$ is supported in $[a_n,b_n],$ ii) $f_n\in C^n,$ iii) $D^{n+1}f_n(a_n)$ fails to exist, and iv) $\|f_n\|_n < 1/2^n.$
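One concrete way to make the choices i)-iv) is sketched below; the particular sequences $a_n=\frac{1}{2n+1}$, $b_n=\frac{1}{2n}$ and the grid-based estimate of $\|\cdot\|_n$ are my own illustrative assumptions, not part of the answer.

```python
# Numerically scale bump functions so that ||f_n||_n < 1/2^n.
# The sequences a_n = 1/(2n+1), b_n = 1/(2n) and the grid-based norm estimate
# are my illustrative choices; condition iii) holds by the bump construction.
import numpy as np

def bump(x, a, b, n):
    """(x-a)^(n+1) * (b-x)^(n+1) on [a, b], zero elsewhere."""
    return np.where((x >= a) & (x <= b),
                    (x - a) ** (n + 1) * (b - x) ** (n + 1), 0.0)

def norm_n(g, n, lo=0.0, hi=1.0, m=20001):
    """Grid estimate of max_{0<=j<=n} sup |D^j g| via repeated finite differences."""
    x = np.linspace(lo, hi, m)
    vals, best = g(x), np.max(np.abs(g(x)))
    for _ in range(n):
        vals = np.gradient(vals, x[1] - x[0])
        best = max(best, np.max(np.abs(vals)))
    return best

for n in range(1, 6):
    a_n, b_n = 1.0 / (2 * n + 1), 1.0 / (2 * n)       # b_1 > a_1 > b_2 > a_2 > ... -> 0
    g = lambda x, a=a_n, b=b_n, n=n: bump(x, a, b, n)
    c = (0.5 ** n) / (2.0 * norm_n(g, n))             # scale so ||c*g||_n ~ 2^(-n)/2
    print(f"n={n}: support [{a_n:.3f}, {b_n:.3f}], "
          f"||c*g||_n ~ {c * norm_n(g, n):.2e} < {0.5**n:.2e}")
```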
Finally, define
$$f(x)= \sum_{n=1}^{\infty} f_n(x).$$
Because the supports of the $f_n$ are disjoint, the series converges for each $x.$ Observe that on $[-b_k,b_k],$ $f=\sum_{n=k}^{\infty}f_n.$ Furthermore,
$$D^j f(x) = \sum_{n=k}^{\infty}D^jf_n(x)\,\,\text {for }x\in [-b_k,b_k],\,j=0,1,\dots,k.$$
This follows from the standard result on uniform convergence and differentiation. Note that the bound $\|f_n\|_n < 1/2^n$ is what gives the uniform convergence we need.
It follows that $D^j f(0)=0$ for each $j.$ Yet it is clear that because $f=f_n$ near $a_n$ and $D^{n+1}f_n(a_n)$ fails to exist, there is no neighborhood of $0$ where $f$ is $C^\infty.$