Settle a classroom argument - do there exist any functions that satisfy this property involving Taylor polynomials?

First, yes, in practical terms, it is very hard to define (indefinitely differentiable) functions that are non-analytic except by doing so piecewise. That's basically because all the usual "pieces" are themselves analytic in the interior of the region where they converge. This itself is an artifact of our history on this subject. In fact, some more-exotic (but standard for 100+ years) functions are not analytic... but their very definition depends on more complicated procedures, so would probably not be very satisfying, either.
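As a concrete illustration of such a piecewise definition, here is the classic smooth-but-non-analytic function $f(x) = e^{-1/x^2}$ (with $f(0) = 0$), sketched in Python. The numerical flatness check is only illustrative, not a proof:

```python
import math

# Classic piecewise-defined smooth function that is NOT analytic at 0:
#   f(x) = exp(-1/x^2) for x != 0, and f(0) = 0.
# Every derivative of f at 0 equals 0, so its Maclaurin series is
# identically zero -- yet f(x) > 0 for every x != 0.
def f(x):
    return math.exp(-1.0 / x**2) if x != 0 else 0.0

# "Flat at 0": f(x)/x^n -> 0 as x -> 0 for every n, which is why
# all the Taylor coefficients at 0 vanish.
for n in (1, 5, 10):
    print(n, f(0.1) / 0.1**n)

print(f(0.5))  # positive, while the Maclaurin series predicts 0
```

So the Maclaurin series of $f$ converges everywhere (it's identically $0$), but it converges to $f$ only at the single point $x = 0$.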

A secondary but important point is that proving that the Taylor-Maclaurin series of a function at some point has infinite radius of convergence (or any other radius of convergence $>0$) does not in itself prove that the thing converges to the function whose Taylor-Maclaurin series it is. Rather, the error terms must go to zero. Of course, from our viewpoint, it takes considerable effort to arrange infinite radius of convergence but error terms not going to zero ... again because we must produce an exotic/un-natural function (from our artifactual viewpoint) one way or another, since "natural" (indefinitely differentiable) functions seem to be analytic.
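For contrast, here is what "error terms going to zero" looks like for a well-behaved function: for $e^x$ the remainder really does shrink to $0$ at every fixed $x$. A quick numeric sketch using nothing beyond the Maclaurin partial sums:

```python
import math

# For e^x the Taylor remainder R_n(x) = e^x - (partial sum) goes to 0
# for every fixed x, which is exactly what justifies saying that
# e^x *equals* its Maclaurin series.
def maclaurin_exp(x, n):
    # partial sum of degree < n
    return sum(x**k / math.factorial(k) for k in range(n))

x = 3.0
for n in (5, 10, 20, 30):
    print(n, abs(math.exp(x) - maclaurin_exp(x, n)))
```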

Historically, indeed, many people (Euler, Lagrange) counted piecewise-defined functions as artificial, and not real functions. Sometimes, the very definition of "function" (in those days) was that the thing was representable by a power series. And, since most of the elementary functions we encounter do have such representations (which often requires proof... but sometimes ignorance is bliss), naively presuming that "all" functions have such expansions does not immediately lead to disaster... and, in fact is marvelously effective, because that is a correct assumption in many contexts.


This isn't quite your question, but maybe it'll clear up some confusion: let $f(x) = \frac{1}{1-x}$, which has Maclaurin series $1+x+x^2+x^3+\cdots.$ The ratio test shows where the series converges (not what it converges to), namely $-1<x<1$; note that the series doesn't converge to the function if $x\geq 1$ (indeed, it diverges there).
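A quick numerical sketch of this (nothing here beyond the geometric partial sums):

```python
# Partial sums of the Maclaurin series 1 + x + x^2 + ... of 1/(1-x).
def geom_partial(x, n):
    return sum(x**k for k in range(n))

def f(x):
    return 1.0 / (1.0 - x)

# Inside the radius of convergence, the partial sums approach f:
print(abs(f(0.5) - geom_partial(0.5, 50)))   # essentially 0

# Outside it, e.g. at x = 1.5, f(1.5) = -2 but the partial sums
# blow up toward +infinity instead of approaching -2:
print(geom_partial(1.5, 50))
```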

The sub is right in saying that you have to prove $R(x) \to 0.$ The ratio test is one way of finding out where that happens.


Short answer: the teacher is right, as his example shows.

There is a very cute result (Borel's theorem): given any sequence of real numbers $a_n$, there exists a smooth function $f$ with these as its Taylor coefficients at $0$, in the sense that for each $n$,

$$ \left| f(x) - \sum \limits_{j<n} a_j x^j \right| \leq C_n |x|^n $$

near $0$.

So we could take $a_j = j^j$ and still get a smooth function with these as its Taylor coefficients. The resulting series, however, converges nowhere except at $x = 0$.
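To see why the series with $a_j = j^j$ has radius of convergence $0$: the $j$-th term is $j^j x^j = (jx)^j$, which eventually grows without bound for any fixed $x \neq 0$. A quick check:

```python
# With a_j = j^j, the j-th term of the series is j^j * x^j = (j*x)^j.
# For any fixed x != 0 the terms eventually blow up, so the series has
# radius of convergence 0: it converges only at x = 0.
x = 0.01
for j in (1, 10, 100, 300):
    term = (j * x) ** j
    print(j, term)
```

Even for $x$ as small as $0.01$, the terms start growing without bound once $j > 1/x$.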