Proving Thomae's function is nowhere differentiable.

I am given the function

$$f(x)=\begin{cases} 0 & \text{if } x \text{ is irrational} \\ \dfrac{1}{q} & \text{if } x=\dfrac{p}{q} \text{ is an irreducible fraction}\end{cases}$$

Spivak proved that for $a\in (0,1)$, we have $$\lim_{x\to a} f(x)=0$$

That is, the function is continuous exactly at the irrationals. I assume we consider $f$ for $x\geq 0$, since for negative $x$ the definition is ambiguous:

If $x=-\dfrac{p}{q}=\dfrac{-p}{q}=\dfrac{p}{-q}$, should we take $f(x)=-\dfrac{1}{q}$ or $f(x)=\dfrac{1}{q}$?
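One common way out is to normalize to a positive denominator. As a minimal sketch in Python (my illustration, not from the book), `fractions.Fraction` stores exactly this normal form:

```python
from fractions import Fraction

def thomae(r: Fraction) -> Fraction:
    # Fraction normalizes to lowest terms with a positive denominator,
    # so f(p/q) = 1/q needs no extra reduction or sign handling.
    return Fraction(1, r.denominator)

print(thomae(Fraction(2, 4)))   # 1/2, since 2/4 reduces to 1/2
print(thomae(Fraction(3, -7)))  # 1/7: the value -3/7 gets denominator +7
```

With that convention $f(-p/q)=f(p/q)=\frac{1}{q}$, which is also what the answers below assume.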

Also, this function is periodic with period $1$, so we can just prove this on $(0,1)$ (as in the first case when proving continuity).
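(To check the periodicity: if $x=\frac{p}{q}$ is irreducible, then $x+1=\frac{p+q}{q}$ with $\gcd(p+q,q)=\gcd(p,q)=1$, so $$f(x+1)=\frac{1}{q}=f(x),$$ and $x+1$ is irrational exactly when $x$ is.)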

Now, Spivak wants me to prove this function is not differentiable at $a$, for any $a$. It is clear that, since it isn't continuous at the rationals, it is not differentiable there. Thus we need to prove the claim for $a$ irrational.

He gives the following hint.

Suppose $a=n,a_1a_2a_3\dots$. Consider the expression $$\frac{{f\left( {a + h} \right) - f\left( a \right)}}{h}$$ for $h$ rational, and also for $h=-0,0\dots0a_{n+1}a_{n+2}\dots$

I have been trying to work this out for a while, but I can't. I am also not sure whether he meant the $n$ in the subscripts to match the $n$ of $a$, or whether that is a typo. The book has $$\frac{{f\left( {a + h} \right) - f\left( h \right)}}{h}$$ instead of $$\frac{{f\left( {a + h} \right) - f\left( a \right)}}{h},$$ which is (I guess) a typo too.

For $h$ rational, $a+h$ is irrational, so one gets $$\frac{{f\left( {a + h} \right) - f\left( a \right)}}{h} = 0,$$ and for $h=-0,0\dots0a_{n+1}a_{n+2}\dots$, assuming $a+h=m/u$ in lowest terms, you get $$\frac{{f\left( {a + h} \right) - f\left( a \right)}}{h} = \frac{{f\left( {a + h} \right)}}{h} = f\left( {\frac{m}{u}} \right)\frac{1}{h} = \frac{1}{uh},$$

but I really don't know what to do with this. I know I have to show the limit $$\mathop {\lim }\limits_{h \to 0} \frac{{f\left( {a + h} \right) - f\left( a \right)}}{h} = \mathop {\lim }\limits_{h \to 0} \frac{{f\left( {a + h} \right)}}{h}$$ doesn't exist for any $a$, but I can't see how.

I'm looking for good hints rather than full solutions.


First, I agree with Hurkyl that you should understand the denominator to be positive.

In the hint the index $n$ in the expansion of $h$ is not the same as the $n$ before the comma in the expansion of $a$.

If $h=-0,0\dots0a_{n+1}a_{n+2}\dots$ (with $n$ zeros after the comma), then $a+h=n,a_1\dots a_n$, the truncation of $a$ after its $n$-th digit, which can be written as a fraction with denominator $10^n$. In lowest terms, therefore, its denominator is at most $10^n$, and $f(a+h)\ge 10^{-n}$. On the other hand, $|h|<10^{-n}$, and $h\ne 0$ because the decimal expansion of the irrational $a$ never terminates. Can you finish it from there? Consider letting $n\to\infty$.
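In case it helps, the two bounds combine, for these choices of $h$, as $$\left|\frac{f(a+h)-f(a)}{h}\right|=\frac{f(a+h)}{|h|}>f(a+h)\cdot 10^{n}\ge 1,$$ while the rational-$h$ quotients are all $0$. As a sanity check, here is a small numerical sketch for $a=\sqrt2$ (my choice of example): the $n$-digit truncations play the role of $a+h$, and `bound` below is the lower bound $f(a+h)\cdot 10^n$ on the absolute difference quotient.

```python
from fractions import Fraction
from math import isqrt

def thomae(r: Fraction) -> Fraction:
    # f(p/q) = 1/q for p/q in lowest terms; Fraction stores lowest terms.
    return Fraction(1, r.denominator)

for n in range(1, 8):
    # n-digit truncation of sqrt(2): isqrt(2*10^(2n)) = floor(sqrt(2)*10^n)
    trunc = Fraction(isqrt(2 * 10 ** (2 * n)), 10 ** n)  # this is a + h
    # |h| = sqrt(2) - trunc < 10^-n, so |quotient| > thomae(trunc) * 10^n
    bound = thomae(trunc) * 10 ** n
    print(n, trunc, bound)  # bound >= 1 for every n
```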


Fix $\epsilon = 1$. Here $x$ is irrational, so $f(x) = 0$ and the difference quotient is just $\dfrac{f(x+h)}{h}$. We want to show that for all $\delta > 0$ there exists $x + h = \dfrac{p}{q} \in (x - \delta, x + \delta)$ for which $\dfrac{f(x + h)}{h} > 1$.

Notice that if $x + h = \dfrac{p}{q}$, then $f(x + h) \ge \dfrac{1}{q}$.

Pick $q > \dfrac{1}{\delta}$. Since $x$ is irrational, $qx$ is not an integer, so we can find an integer $p$ with $p - 1 < qx < p$. Set $x + h = \dfrac{p}{q}$. Dividing through by $q$:

$$ \dfrac{p}{q} - \dfrac{1}{q} < x < \dfrac{p}{q} \;\Rightarrow\; 0 < h < \dfrac{1}{q} < \delta $$

Finally:

$$ \dfrac{f(x + h)}{h} \ge \frac{1}{q} \cdot \frac{1}{h} > \frac{1}{q} \cdot q = 1 $$
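For a concrete instance (my numbers): with $x=\sqrt2-1\approx 0.41421$ and $\delta=0.1$, take $q=11>\frac{1}{\delta}$; then $qx\approx 4.556$, so $p=5$, $x+h=\frac{5}{11}\approx 0.45455$, $h\approx 0.0403<\frac{1}{11}$, and $$\frac{f(x+h)}{h}=\frac{1/11}{0.0403\ldots}\approx 2.25>1.$$ Since the quotient is $0$ whenever $x+h$ is irrational, quotients equal to $0$ and quotients greater than $1$ both occur for arbitrarily small $h$, so $\lim_{h\to 0}\frac{f(x+h)}{h}$ cannot exist.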