Why doesn't the Stone-Weierstrass theorem imply that every function has a power series expansion?

I know that not every function has a power series expansion. Yet here is what I don't understand: for every $C^{\infty}$ function $f$ on $[a,b]$ there is a sequence of polynomials $(P_n)$ such that $P_n$ converges uniformly to $f$. That is to say:

$$\forall x \in [a,b], \quad f(x) = \lim_{n \to \infty} \sum_{k = 0}^{\infty} a_{k,n}x^k,$$

where $a_{k,n}$ is the coefficient of $x^k$ in $P_n$ (zero for $k > \deg P_n$).

But then because it converges uniformly why can't I say that :

$$\forall x \in [a,b], \quad f(x) = \sum_{k = 0}^{\infty} \left(\lim_{n \to \infty} a_{k,n}\right)x^k$$

And so $f$ would have a power series expansion with coefficients $\lim_{n \to \infty} a_{k,n}$.


Short answer: the sequence of polynomials guaranteed by the Stone-Weierstrass theorem is generally not built by appending terms of higher and higher order. The early coefficients can change as $n$ grows, so the $P_n$ are not the partial sums of a single power series.
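One way to make this concrete: Bernstein polynomials give an explicit sequence of uniform approximants on $[0,1]$ (the standard constructive proof of Weierstrass's theorem). The sketch below, with the hypothetical choice $f(x)=\sqrt{x}$ (continuous, hence uniformly approximable, but not differentiable at $0$), expands $B_n(f)$ in the monomial basis and watches the coefficient of $x^1$: it equals $\sqrt{n}$, so it grows without bound instead of settling down.

```python
import math

def bernstein_coeffs(f, n):
    """Monomial coefficients [a_0, ..., a_n] of the Bernstein polynomial
    B_n(f)(x) = sum_k f(k/n) * C(n,k) * x^k * (1-x)^(n-k) on [0,1]."""
    coeffs = [0.0] * (n + 1)
    for k in range(n + 1):
        w = f(k / n) * math.comb(n, k)
        # expand x^k * (1-x)^(n-k) = sum_j C(n-k,j) * (-1)^j * x^(k+j)
        for j in range(n - k + 1):
            coeffs[k + j] += w * math.comb(n - k, j) * (-1) ** j
    return coeffs

# the approximants converge uniformly, yet the coefficient of x^1
# is n * (f(1/n) - f(0)) = sqrt(n): it has no limit as n grows
for n in (4, 16, 64, 256):
    print(n, bernstein_coeffs(math.sqrt, n)[1])
```

So even for this textbook construction, $\lim_{n\to\infty} a_{1,n}$ simply does not exist.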


$$\lim_{n\to\infty}\left(\lim_{k\to\infty} a_{n,k}\right)$$ is, in general, not the same as $$\lim_{k\to\infty}\left(\lim_{n\to\infty} a_{n,k}\right),$$ and in order to swap your infinite sum (which is by definition a limit) with your limit in $n$, you would need an interchange of exactly this kind.
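A tiny numeric sketch of this failure, using the hypothetical family $a_{n,k}=n/(n+k)$:

```python
def a(n, k):
    # a_{n,k} = n / (n + k)
    return n / (n + k)

# for fixed n, a(n,k) -> 0 as k -> infinity, so lim_n lim_k = 0;
# for fixed k, a(n,k) -> 1 as n -> infinity, so lim_k lim_n = 1
print(a(5, 10**9))   # close to 0
print(a(10**9, 5))   # close to 1
```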


I'm not sure what sense you can give to $$ \lim_{n\to\infty}a_{k,n}. $$ Nothing in the Stone-Weierstrass theorem even hints at the existence of such a limit.

The classical example of a $C^\infty$ function that is not everywhere analytic illustrates this well. Try to find polynomial approximations of $$ f(x)=\begin{cases} 0 & x=0 \\[4px] e^{-1/x^2} & x\ne0 \end{cases} $$ and you'll realize that the exchange is not possible, because the limit under scrutiny cannot exist.
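A quick numerical sketch of why this function defeats any power series at $0$: the ratio $f(x)/x^k$ tends to $0$ as $x\to 0$ for every fixed $k$, which forces every Taylor coefficient at $0$ to vanish, even though $f$ is positive away from $0$.

```python
import math

def f(x):
    # the classical smooth-but-not-analytic function
    return 0.0 if x == 0 else math.exp(-1.0 / x**2)

# f(x)/x^k -> 0 as x -> 0 for every fixed k; this is what forces
# f^(k)(0) = 0 for all k, so the Taylor series of f at 0 is identically 0
for x in (0.5, 0.3, 0.2, 0.1):
    print(x, f(x) / x**8)
```

So the only candidate power series at $0$ is the zero series, which clearly does not equal $f$.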


Yes, but since I have uniform convergence, shouldn't the switch of limits work?

If I understand the theorem correctly, it only says that the entire expression converges uniformly. It says nothing about the convergence of the individual terms inside the sum.

Indeed, a sum of functions can be uniformly convergent even if the individual terms of the sum do not converge at all!

Counterexample:

$g_{n,k}(x)=\begin{cases} nx & \text{if }k=1\\ -nx & \text{if }k=2\\ 0 & \text{otherwise} \end{cases}$

$f_n(x)=\sum\limits_{k=0}^{\infty}g_{n,k}(x)$

Then:

$f_n(x)=nx-nx=0$, so the value of $f_n(x)$ is independent of $n$: all elements of the function series are the same function, and therefore the series converges uniformly to $f(x)=0$.

Please also note that $\lim\limits_{x\to\pm\infty}f(x)=0$ in this example.

The individual terms of the sum ($g_{n,1}(x)=nx$ and $g_{n,2}(x)=-nx$), however, diverge both for $n\to\infty$ (at any fixed $x\ne0$) and for $x\to\pm\infty$.
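The counterexample can be checked directly; here is a minimal sketch (the names `g` and `f_n` are just the definitions above, with the infinite sum truncated since every term beyond $k=2$ is zero):

```python
def g(n, k, x):
    # term k of the n-th sum: nx for k=1, -nx for k=2, 0 otherwise
    if k == 1:
        return n * x
    if k == 2:
        return -n * x
    return 0.0

def f_n(n, x, terms=10):
    # the sum is formally infinite, but every term beyond k=2 vanishes
    return sum(g(n, k, x) for k in range(terms))

# every f_n is identically zero, so the series converges uniformly to 0 ...
print(f_n(10**6, 3.0))    # 0.0
# ... yet the individual terms diverge as n grows
print(g(10**6, 1, 3.0))   # 3000000.0
print(g(10**6, 2, 3.0))   # -3000000.0
```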