Doubt about Taylor series: do the successive derivatives at a point determine the whole function?
I'm currently relearning Taylor series, and yesterday I thought about something that left me puzzled. As far as I understand, whenever you take the Taylor series of a function $f(x)$ around a point $x = a$, the function is exactly equal to its Taylor series, that is:
$$ f(x) = \sum_{n=0}^{\infty} \frac{f^{(n)}(a)}{n!}(x-a)^n $$
For example, if we take $f(x) = e^x$ and $a = 0$, we obtain: $ e^x = \sum_{n=0}^{\infty} \frac{x^n}{n!} $
My doubt is: the only inputs to the Taylor series formula are $f(a), f'(a), f''(a),$ etc., that is, the successive derivatives of the function $f$ evaluated at a single point $x = a$. But the Taylor series of $f(x)$ determines the whole function! How is it possible that the successive derivatives of the function evaluated at a single point determine the whole function? Does this mean that if we know the values of $f^{(n)}(a)$, then $f$ is uniquely determined? Is there an intuition as to why the successive derivatives of $f$ at a single point encode the necessary information to determine $f$ uniquely?
Maybe I'm missing a key insight and all my reasoning is wrong; if so, please tell me where my mistake is.
Thanks!
Solution 1:
Your suspicion is justified: in general, $f$ is not determined by its derivatives at a single point. Functions that are (locally) equal to their Taylor series are called analytic. But not all smooth functions are analytic; for example,
$$x\mapsto\begin{cases}e^{-\frac{1}{x^2}}, & x>0\\0, & x\leq 0\end{cases}$$ is a smooth function whose derivatives at zero are all zero, so its Taylor series at zero (which is identically zero) does not determine the function.
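To see why all the derivatives vanish at zero, here is a quick sketch. For $x>0$, one can show by induction that every derivative has the form
$$f^{(n)}(x)=p_n\!\left(\frac{1}{x}\right)e^{-\frac{1}{x^2}}$$
for some polynomial $p_n$, and the exponential factor decays faster than any polynomial in $\frac{1}{x}$ grows, so
$$\lim_{x\to 0^+}p_n\!\left(\frac{1}{x}\right)e^{-\frac{1}{x^2}}=0.$$
Hence $f^{(n)}(0)=0$ for every $n$, and the Taylor series at zero is the zero series, even though $f$ is not the zero function.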
Furthermore, the exact statement of Taylor's theorem is quite different from what you said. It is as follows:
If $f\in C^{k+1}(\mathbb{R})$, then $$f(x)=\sum_{n=0}^k \frac{f^{(n)}(a)}{n!}(x-a)^n + \frac{f^{(k+1)}(\xi)}{(k+1)!}(x-a)^{k+1}$$ for some $\xi$ between $a$ and $x$.
If you now let $k\rightarrow\infty$, it is in general not clear that this remainder term converges to zero.
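For the exponential function from the question, however, the remainder does go to zero, which is why $e^x$ really is equal to its Taylor series everywhere: with $a=0$ and $\xi$ between $0$ and $x$,
$$\left|\frac{f^{(k+1)}(\xi)}{(k+1)!}x^{k+1}\right|=\frac{e^{\xi}}{(k+1)!}|x|^{k+1}\leq\frac{e^{|x|}\,|x|^{k+1}}{(k+1)!}\xrightarrow{k\to\infty}0,$$
since the factorial eventually outgrows any fixed geometric factor $|x|^{k+1}$.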
Solution 2:
Functions which are the sum of their Taylor series within the interval (or disk, for functions of a complex variable) of convergence are known as analytic functions. Many elementary functions are analytic: $\exp, \sin, \cos, \sinh, \cosh$, and of course polynomials are analytic on $\mathbf R$ (or $\mathbf C$).
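Note that the interval of convergence can be smaller than the function's domain. For instance, $\ln(1+x)$ is defined (and analytic) on $(-1,\infty)$, but its Taylor series at $0$,
$$\ln(1+x)=\sum_{n=1}^{\infty}\frac{(-1)^{n+1}}{n}x^n,$$
converges only for $-1<x\leq 1$.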
It is not true that, in general, an infinitely differentiable function of a real variable is analytic on the interval of convergence of its Taylor series, as @humanStampedist's example shows.
However, for a function of a complex variable, being differentiable on an open set already suffices to ensure the function is analytic (one usually says holomorphic in this case). This is due to the very strong constraints imposed by the Cauchy-Riemann equations.
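For reference, writing $f(x+iy)=u(x,y)+iv(x,y)$, the Cauchy-Riemann equations are
$$\frac{\partial u}{\partial x}=\frac{\partial v}{\partial y},\qquad\frac{\partial u}{\partial y}=-\frac{\partial v}{\partial x},$$
and complex differentiability forces both to hold at every point, which is far more restrictive than differentiability in the real sense.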