Why the existence of a Taylor series doesn't imply it converges to the original function

Please note that I've read this question and it did not address mine.

I've been presented with the following argument regarding Taylor series:

We have a function $f(x)$; now assume that there exists a power series that's equal to it:

$$f(x)=a_0 + a_1 x + a_2 x^2 +\dots$$

One can quickly show, using differentiation, that

$$f(x) = f(0) + f'(0) x + \dfrac{f''(0)}{2!} x^2 + \dots$$
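(The step here: differentiate the assumed series term by term $n$ times and set $x=0$; every term except the degree-$n$ one drops out, leaving
$$f^{(n)}(0) = n!\,a_n, \qquad \text{i.e.} \qquad a_n = \frac{f^{(n)}(0)}{n!}.)$$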

It seems that this argument implies two things:

  1. For the Taylor series of a function around a point to exist, the function has to be differentiable infinitely many times at that point, and in particular defined there.

  2. If the Taylor series for a function exists, then the series is equal to the function, i.e. it converges to the original function at every point $x$.

Now I know very well that point 2 is false (not every smooth function is analytic).

But point 2 seems to follow from the argument I presented above, which assumes that if there exists a power series equal to the function (in other words, one that converges to the function for all $x$), then that series is given by the Taylor series. So what is wrong with the argument above?


Solution 1:

> now assume that there exists a power series that's equal to it:

This is where the problem lies. If a function is expressible by a power series at a point, then that power series is the Taylor series. But not all functions are so expressible.

Solution 2:

You are confusing the following two conditions:

  1. The function $f$ admits a Taylor series at (argument) $0$, and

  2. There exists a power series $\sum_i a_i x^i$ that converges to $f(x)$ for all $x$ (in the domain of $f$).

While the second condition implies the first (and the Taylor series will then be that $\sum_i a_i x^i$), the first statement does not imply the second. It only means that $f$ is smooth (infinitely differentiable) in a neighbourhood of $0$ (namely: its repeated derivatives at $0$ define the Taylor series). This is what the first enumerated point in your question seems to acknowledge. But you cannot get from there to the second statement, and the fact that the second implies the first is not helpful in doing so.

The only relation required between a function and its Taylor series is that those derivatives and the coefficients of the series match. For a given value $x\neq0$, the Taylor series may or may not converge, but even if it does, this does not (necessarily) give you $f(x)$. To see why not, it suffices to add to $f$ some nonzero function with zero Taylor series; your parenthesised remark seems to indicate you are aware that such functions exist (if not, consider $\exp(-x^{-2})$ extended by continuity at $x=0$). But I repeat, the Taylor series does not have to converge at all, for any $x$ (except for $x=0$).
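To make that concrete: take $g(x)=\exp(-x^{-2})$ for $x\neq0$ and $g(0)=0$. All derivatives of $g$ at $0$ are $0$, so $f$ and $f+g$ have exactly the same Taylor series at $0$; yet $g(x)\neq0$ for every $x\neq0$, so the two functions differ everywhere except at $0$, and one series cannot converge to both of them.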

A Taylor series is just a formal power series, a way to collect the information of all derivatives of $f$ evaluated at $0$. Indeed (though it takes some work to see this; the result is known as Borel's theorem) every formal power series occurs as the Taylor series of some function (in fact of infinitely many of them).

Solution 3:

A standard example is $$ f(x)=\begin{cases}e^{-\frac{1}{x^2}}&x>0\\0&x\leq 0\end{cases}. $$ This function is smooth (infinitely differentiable) on all of $\mathbb{R}$, but if you compute the Taylor series at $0$, you'll find that all of the Taylor coefficients are zero! Therefore, this function has a Taylor series at $0$ which converges for all $x$ (to the zero function), but it isn't equal to the original function at any $x>0$. This illustrates the statement you mentioned, that not all smooth functions are analytic.
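If you want to check this computationally, here is a minimal sketch (using SymPy; the tool choice and the loop bound are mine, not part of the answer above). It computes, for the $x>0$ branch, the one-sided limits at $0$ of the first few derivatives; by the standard lemma that these limits give the one-sided derivatives at $0$, they are exactly the values $f^{(n)}(0)$ that determine the Taylor coefficients.

```python
import sympy as sp

x = sp.symbols('x', positive=True)
f = sp.exp(-1 / x**2)  # the x > 0 branch of the piecewise function above

# Take the n-th derivative of the x > 0 branch, then let x -> 0 from the right.
# Every limit comes out as 0, so every Taylor coefficient f^(n)(0)/n! is 0.
for n in range(6):
    print(n, sp.limit(sp.diff(f, x, n), x, 0, dir='+'))
```

Every printed limit is $0$, so the Taylor series at $0$ is the zero series (which of course converges everywhere), while $f(x)>0$ for every $x>0$.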