Converse of Taylor's Theorem
Solution 1:
For $n > 1$ we need some regularity assumptions on the $f_k$ (or on $R_n$). Without such assumptions, consider $$f(x) = \begin{cases} 0 &\text{if } x = 0 \\ x^{n+1}\sin (x^{-n}) &\text{if } x \neq 0\end{cases}$$ and $$f_k(x) = \begin{cases} 0 &\text{if } x = 0 \\ f^{(k)}(x) &\text{if } x \neq 0\end{cases}$$ on any interval $(a,b)$ containing $0$.

Outside the origin $f$ is analytic, hence the remainder term $R_n(x,h)$ is $o(h^n)$ for every $x \in (a,b) \setminus \{0\}$ (though the constants depend on $x$, of course). For $x = 0$ all the coefficients $f_k(0)$ vanish, so $R_n(0,h) = f(h)$ and $$\frac{R_n(0,h)}{h^n} = h\sin (h^{-n}) \longrightarrow 0 \quad (h \to 0)\,,$$ thus $R_n(x,h) = o(h^n)$ for every $x \in (a,b)$.

But $f$ is differentiable only once, since $$f_1(x) = f'(x) = \begin{cases} 0 &\text{if } x = 0 \\ (n+1)x^n\sin(x^{-n}) - n \cos (x^{-n}) &\text{if } x \neq 0\end{cases}$$ isn't even continuous at $0$.
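For completeness, here is a quick sketch of the computation behind the last display; it uses nothing beyond the product rule, the chain rule, and the difference quotient at the origin, with the definitions already fixed above. For $x \neq 0$, $$f'(x) = (n+1)x^{n}\sin(x^{-n}) + x^{n+1}\cos(x^{-n})\,\bigl(-n x^{-n-1}\bigr) = (n+1)x^{n}\sin(x^{-n}) - n\cos(x^{-n})\,,$$ and at the origin $$f'(0) = \lim_{h \to 0}\frac{f(h) - f(0)}{h} = \lim_{h \to 0} h^{n}\sin(h^{-n}) = 0\,,$$ because $|h^{n}\sin(h^{-n})| \le |h|^{n}$. As $x \to 0$ the term $(n+1)x^{n}\sin(x^{-n})$ tends to $0$ while $n\cos(x^{-n})$ keeps oscillating between $-n$ and $n$, so $\lim_{x \to 0} f'(x)$ does not exist; and since differentiability of $f'$ at $0$ would force continuity there, $f''(0)$ cannot exist.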