Taylor series convergence/sum question [duplicate]

> now assume that there exists a power series that's equal to it:

This is where the problem lies. If a function is expressible by a power series in a neighbourhood of a point, then that power series is its Taylor series at that point. But not every smooth function is so expressible.


You are confusing the following two conditions:

  1. The function $f$ admits a Taylor series at (argument) $0$, and

  2. There exists a power series $\sum_ia_ix^i$ that converges to $f(x)$ for all $x$ (in the domain of $f$).

While the second condition implies the first (and the Taylor series will then be that $\sum_ia_ix^i$), the first condition does not imply the second. It only means that $f$ is smooth (infinitely differentiable) in a neighbourhood of $0$ (namely: its repeated derivatives at $0$ define the Taylor series). This is what the first enumerated point in your question seems to acknowledge. But you cannot get from there to the second condition, and the fact that the second implies the first is not helpful in doing so.

The only relation required between a function and its Taylor series is that those derivatives and the coefficients of the series match. For a given value $x\neq0$, the Taylor series may or may not converge, but even if it does, this does not (necessarily) give you $f(x)$. To see why not, it suffices to add to $f$ some nonzero function with zero Taylor series; your parenthesised remark seems to indicate you are aware that such functions exist (if not, consider $\exp(-x^{-2})$ extended by continuity at $x=0$). But I repeat, the Taylor series does not have to converge at all, for any $x$ (except for $x=0$).
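To make this concrete: write $g(x)=\exp(-x^{-2})$ for $x\neq0$ and $g(0)=0$. Then $g$ is smooth and $g^{(n)}(0)=0$ for every $n$, so $$f\quad\text{and}\quad f+g$$ have the identical Taylor series at $0$, even though $f(x)\neq f(x)+g(x)$ for every $x\neq0$. The Taylor series alone cannot distinguish $f$ from $f+g$.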

A Taylor series is just a formal power series, a way to collect the information of all derivatives of $f$ evaluated at $0$. Indeed (though it takes some work to see this; the result is known as Borel's theorem) every formal power series occurs as the Taylor series of some smooth function (in fact of infinitely many of them).
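For instance, the formal power series $$\sum_{n\geq0}n!\,x^n$$ has radius of convergence $0$, yet by the theorem just mentioned it is the Taylor series at $0$ of some smooth function; for such a function the Taylor series converges at no point except $x=0$.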


A standard example is $$ f(x)=\begin{cases}e^{-\frac{1}{x^2}}&x>0\\0&x\leq 0\end{cases}. $$ This function is smooth (infinitely differentiable) on all of $\mathbb{R}$, but if you compute its Taylor series at $0$, you'll find that all of the Taylor coefficients are zero! Therefore this function has a Taylor series at $0$ which converges for every $x$ (to the zero function), yet is not equal to the original function for any $x>0$. This illustrates the statement you mentioned: not all smooth functions are analytic.
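A quick sketch of why all the coefficients vanish: for $x>0$ every derivative has the form $$f^{(n)}(x)=p_n\!\left(\tfrac1x\right)e^{-1/x^2}$$ for some polynomial $p_n$ (for instance $f'(x)=\tfrac{2}{x^3}e^{-1/x^2}$), and $e^{-1/x^2}$ decays faster than any power of $x$, so $$\lim_{x\to0^+}\frac{e^{-1/x^2}}{x^k}=0\qquad\text{for every }k\geq0.$$ By induction this forces $f^{(n)}(0)=0$ for all $n$, hence every Taylor coefficient $f^{(n)}(0)/n!$ is zero.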


Assume that $f$ is infinitely differentiable in a neighborhood of $0$. Then Taylor's theorem, which writes $$f(x)=T_n(x)+R_n(x)\tag{1}$$ with $T_n$ the $n$-th Taylor polynomial of $f$ at $0$, says that for fixed $n\geq0$ you have $$\lim_{x\to0}{R_n(x)\over|x|^n}=0\ .$$ Your question about the Taylor series $T_\infty(x)$ starts with $(1)$ as well, but it refers to the limit $n\to\infty$ for fixed $x$ instead. You want to know under which circumstances $$\lim_{n\to\infty} T_n(x)=f(x)\tag{2}$$ holds for all $x$ in a suitable neighborhood $U$ of $0$. Now $(2)$ is equivalent to $\lim_{n\to\infty}R_n(x)=0$ for each $x\in U$. The latter can only be proven by analyzing the error terms $R_n(x)$, using the various analytic forms available for $R_n(x)$ and, e.g., concrete estimates of the higher derivatives $f^{(n+1)}(x)$ for $x\in U$.
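To see how such an analysis goes, take $f(x)=e^x$ and the Lagrange form of the remainder, $R_n(x)=\frac{f^{(n+1)}(\xi)}{(n+1)!}x^{n+1}$ for some $\xi$ between $0$ and $x$. Since $f^{(n+1)}=\exp$, $$|R_n(x)|\leq\frac{e^{|x|}\,|x|^{n+1}}{(n+1)!}\longrightarrow0\quad(n\to\infty)$$ for every fixed $x$, so $(2)$ holds on all of $\mathbb{R}$: the Taylor series of $e^x$ converges to $e^x$ everywhere.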


It is true that IF a function $f$ admits a power series expansion, THEN its associated Taylor series converges to $f$, and in fact the two series coincide (this is because power series can be differentiated term by term). But this tells us nothing about the behaviour of a Taylor series if all we know is that it exists; in particular, if we do not know whether $f$ itself admits such a power series expansion at all.
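To spell out why the two series must coincide: if $f(x)=\sum_{i\geq0}a_ix^i$ on an interval around $0$, then differentiating term by term $k$ times gives $$f^{(k)}(x)=\sum_{i\geq k}i(i-1)\cdots(i-k+1)\,a_i\,x^{i-k},$$ and setting $x=0$ kills every term except $i=k$, leaving $f^{(k)}(0)=k!\,a_k$. Hence $a_k=f^{(k)}(0)/k!$, which is precisely the $k$-th Taylor coefficient.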