To what extent is the Taylor polynomial the best polynomial approximation?
Here is a norm on $\mathscr C^n([a,b])$ for which $T_{n}(\cdot,x_0)$ is the best approximation to $f$: $$\|f\|_* = \sum_{k=0}^{n} |f^{(k)}(x_0)|+ \sup_{x\in[a,b]}|f^{(n)}(x)-f^{(n)}(x_0)|.$$ This is a reasonable norm, equivalent to the usual $C^n$ norm. For any polynomial $p$ of degree at most $n$, the derivative $p^{(n)}$ is constant, so the sup term is unchanged when we pass from $f$ to $f-p$: $$\|f-p\|_* = \sum_{k=0}^{n} |f^{(k)}(x_0)-p^{(k)}(x_0)|+ \sup_{x\in[a,b]}|f^{(n)}(x)-f^{(n)}(x_0)|,$$ which is minimized exactly when $p^{(k)}(x_0)=f^{(k)}(x_0)$ for $0\le k\le n$, i.e. when $p=T_{n}(\cdot,x_0)$.
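If you want to see this minimization numerically, here is a quick sketch (my own choice of test case, not part of the argument above): $f=\sin$ on $[-1,1]$, $x_0=0$, $n=3$, comparing $\|f-p\|_*$ for $p=T_3$ against a slightly perturbed polynomial.

```python
import numpy as np

def star_norm(g_derivs, x0, xs):
    """||g||_* = sum_k |g^(k)(x0)| + sup_x |g^(n)(x) - g^(n)(x0)|.

    g_derivs: list of callables [g, g', ..., g^(n)]."""
    n = len(g_derivs) - 1
    head = sum(abs(d(x0)) for d in g_derivs)
    tail = np.max(np.abs(g_derivs[n](xs) - g_derivs[n](x0)))
    return head + tail

xs = np.linspace(-1.0, 1.0, 2001)
x0, n = 0.0, 3

# derivatives of f = sin: sin, cos, -sin, -cos
f_derivs = [np.sin, np.cos, lambda x: -np.sin(x), lambda x: -np.cos(x)]

def diff_derivs(p):
    """Derivatives of f - p for a numpy Polynomial p of degree <= n."""
    ps = [p]
    for _ in range(n):
        ps.append(ps[-1].deriv())
    return [lambda x, fd=fd, pd=pd: fd(x) - pd(x)
            for fd, pd in zip(f_derivs, ps)]

taylor = np.polynomial.Polynomial([0.0, 1.0, 0.0, -1.0 / 6.0])  # x - x^3/6
print(star_norm(diff_derivs(taylor), x0, xs))         # minimal value
print(star_norm(diff_derivs(taylor + 0.01), x0, xs))  # perturbed: strictly larger
```

The first value is just the sup term (the head vanishes because $T_3$ matches all derivatives of $\sin$ at $0$); any perturbation adds to the head without touching the tail.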
The answer is that, in general, the Taylor polynomial is not a very good approximation on the whole of $[a,b]$. Indeed, the remainder of the Taylor series converges to $0$ on $[a,b]$ if and only if $f$ is represented there by its Taylor series at $x_0$, i.e. $f$ is analytic, which of course is not always the case: the standard example is $f(x)=e^{-1/x^2}$ (with $f(0)=0$), all of whose Taylor polynomials at $0$ are identically zero. The intuition is that purely local information at $x_0$ has no chance of being sufficient for a good approximation on all of $[a,b]$.
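Here is a small numerical illustration of this failure (the function and interval are my choice for the example): $f(x)=1/(1+x^2)$ is $C^\infty$ on $[-2,2]$, but its Taylor series at $0$, the geometric series $\sum_k (-1)^k x^{2k}$, has radius of convergence $1$, so the Taylor polynomials get *worse* near the endpoints as the degree grows.

```python
import numpy as np

xs = np.linspace(-2.0, 2.0, 4001)
f = 1.0 / (1.0 + xs**2)

for n in (2, 6, 10, 14):
    # T_n(x) = sum_{k <= n/2} (-1)^k x^(2k)
    Tn = sum((-1)**k * xs**(2 * k) for k in range(n // 2 + 1))
    print(n, np.max(np.abs(f - Tn)))  # sup error on [-2,2] blows up with n
```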
We know (Weierstrass) that a continuous function can be approximated uniformly on a closed interval by polynomials, but it is a bit tricky to say which polynomials exactly. Another natural candidate would be interpolation polynomials, but it turns out that they can fail badly as well (see http://en.wikipedia.org/wiki/Runge%27s_phenomenon). The answer is Bernstein polynomials (http://en.wikipedia.org/wiki/Bernstein%27s_polynomial_theorem); a sketch follows below.
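For concreteness, here is a minimal sketch of Bernstein approximation on $[0,1]$ (the test function $f(x)=|x-\tfrac12|$ and the degrees are my choices for illustration): $B_n(f)(x)=\sum_{k=0}^n f(k/n)\binom{n}{k}x^k(1-x)^{n-k}$, and for continuous $f$ the uniform error tends to $0$, even for this kink function where equispaced interpolation would oscillate.

```python
import numpy as np
from math import comb

def bernstein(f, n, xs):
    """Evaluate the degree-n Bernstein polynomial of f at the points xs."""
    ks = np.arange(n + 1)
    basis = np.array([comb(n, k) for k in ks], dtype=float)  # C(n, k)
    return np.array([
        np.sum(f(ks / n) * basis * x**ks * (1 - x)**(n - ks))
        for x in xs
    ])

f = lambda x: np.abs(x - 0.5)
xs = np.linspace(0.0, 1.0, 1001)
for n in (4, 16, 64, 256):
    print(n, np.max(np.abs(f(xs) - bernstein(f, n, xs))))  # uniform error -> 0
```

Note how slowly the error decreases (roughly like $n^{-1/2}$ for this Lipschitz $f$): Bernstein polynomials trade speed for unconditional uniform convergence.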