How local is the information of a derivative?
I have read it a thousand times: "you only need local information to compute derivatives." To be more precise: when you take a derivative at, say, a point $a$, what you are essentially doing is taking a limit, so you only need to look at the open interval $(a-\delta, a+\delta)$.
Taylor's theorem seems to contradict this: from the derivatives in just one point, you can reconstruct the whole function within its radius of convergence (which can be infinity).
For example, consider the function: $f: \mathbb{R} \rightarrow \mathbb{R}: x \mapsto \left\{ \begin{array}{lr} x+3\pi/2, & x \leq -3\pi/2 \\ \cos(x), & -3\pi/2 \leq x \leq 3\pi/2 \\ x-3\pi/2, & x \geq 3\pi/2 \end{array} \right.$
Wolfram Alpha tells me that $D^{100}f(0)=\cos(0)=1$. This should give us more than enough information to get a Taylor expansion that converges beyond the interval where $f$ is the $\cos$ function ($R=\infty$ for $\cos$, so eventually we have to get there).
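A quick symbolic check (a sketch using SymPy, with the right-hand branch taken as $x-3\pi/2$, which is presumably what the definition intends) agrees with Wolfram Alpha: near $0$ the function is just $\cos$, so every derivative at $0$ equals the corresponding derivative of $\cos$.

```python
import sympy as sp

x = sp.symbols('x')

# Piecewise function from the question
# (right-hand branch assumed to be x - 3*pi/2)
f = sp.Piecewise(
    (x + 3*sp.pi/2, x <= -3*sp.pi/2),
    (sp.cos(x),     x <= 3*sp.pi/2),
    (x - 3*sp.pi/2, True),
)

# 100th derivative evaluated at 0: only the cos branch matters near 0
d100_at_0 = sp.diff(f, x, 100).subs(x, 0)
print(d100_at_0)  # 1, i.e. cos(0)
```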
Let me put it this way: look at the limiting case. All you need for a Taylor expansion that converges over all the reals is all the derivatives at $0$. These would give you exactly the same Taylor expansion as you'd get for the cosine function, while the function from which we took the derivatives is clearly not the cosine function over all the reals.
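This limiting case can be checked numerically (a sketch in plain Python): the Taylor polynomials built from the derivatives of $f$ at $0$ are exactly the Taylor polynomials of $\cos$, so at a point like $x = 5 > 3\pi/2$ they converge to $\cos(5) \approx 0.2837$ rather than to the linear branch's value $5 - 3\pi/2 \approx 0.2876$.

```python
import math

def taylor_cos(x, n_terms):
    """Partial sum of the Taylor series of cos at 0: sum of (-1)^k x^(2k) / (2k)!"""
    return sum((-1)**k * x**(2*k) / math.factorial(2*k) for k in range(n_terms))

x = 5.0                        # outside [-3*pi/2, 3*pi/2]
series_value = taylor_cos(x, 30)

print(series_value)            # converges to cos(5), not to f(5)
print(math.cos(x))             # ~0.2837
print(x - 3*math.pi/2)         # ~0.2876, the actual value of f(5)
```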
So my question is: is Wolfram Alpha wrong? If it is right, why does this seem to violate Taylor's theorem? If it is wrong, is that because the local region of the domain you need to compute the $n$th derivative grows with $n$?
Edit 1: en.m.wikipedia.org/wiki/Taylor%27s_theorem. The most basic version of Taylor's theorem for one variable does not mention analyticity, and it's easy to prove that the "remainder" goes to zero as you take more and more derivatives, so that $f(x)$ is determined at any $x$ by the derivatives of $f$ at $0$.
The subset of (infinitely many times) differentiable functions that actually coincide with their Taylor series is relatively small. Such a function is called "analytic", and we say that analytic functions are completely determined by local data at a point. Most functions that you encounter that have a name will be analytic, such as $\cos(x)$, $e^x$, $\sqrt{x}$ (away from $0$) and any polynomial, as well as products, sums and compositions of analytic functions.
There are functions that are infinitely many times continuously differentiable everywhere (these are called $C^\infty$), and thus have a Taylor series at each point, but the Taylor series at a point $p$ might fail to approximate the function even arbitrarily close to $p$. The analytic functions are therefore a particularly nice subset of the $C^\infty$ functions.
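The classic example is $g(x) = e^{-1/x^2}$ for $x \neq 0$, with $g(0) := 0$: it is $C^\infty$, every derivative at $0$ is $0$, so its Taylor series at $0$ is identically zero, yet $g(x) > 0$ for all $x \neq 0$. A sketch using SymPy (the derivatives at $0$ are computed as limits, since the closed-form expressions are only valid for $x \neq 0$):

```python
import sympy as sp

x = sp.symbols('x')
g = sp.exp(-1/x**2)  # with g(0) := 0, this is C-infinity but not analytic at 0

# The first few derivatives all vanish at 0, so the Taylor series at 0 is 0,
# even though g itself is positive everywhere away from 0
for n in range(4):
    deriv_at_0 = sp.limit(sp.diff(g, x, n), x, 0)
    print(n, deriv_at_0)  # all 0
```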
Taylor's theorem requires a function $f:\mathbb R \to \mathbb R$ that is $k$-times differentiable. To capture the cosine wave completely, your function would have to be infinitely differentiable on all of $\mathbb R$. It is not.
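This can be made concrete at the junction $x = 3\pi/2$ (a SymPy sketch, taking the right-hand branch to be $x - 3\pi/2$, as the piecewise definition presumably intends): the two branches agree in value and in their first two derivatives there, but their third derivatives differ, so $f$ is only twice differentiable at that point.

```python
import sympy as sp

x = sp.symbols('x')
a = 3*sp.pi/2
left = sp.cos(x)   # branch valid for x <= 3*pi/2
right = x - a      # assumed branch for x >= 3*pi/2

# Compare one-sided derivatives of the two branches at the junction x = 3*pi/2
for n in range(4):
    l = sp.diff(left, x, n).subs(x, a)
    r = sp.diff(right, x, n).subs(x, a)
    print(n, l, r)  # n = 0, 1, 2 match; n = 3 gives -1 vs 0
```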