Intuitive Explanation why the Fundamental Theorem of Algebra fails for infinite sums
Solution 1:
The topological argument for the FTA is that for $R$ large, the circle $|z|=R$ is sent to a path that "winds around zero" much like the image of $z^n$ does; in particular, it goes around zero $n$ times counterclockwise.
When $R$ is small, the image of the circle is a small loop near $f(0)$. If $f(0)\neq0$, this loop never gets around zero even once, so it winds around zero zero times.
Now try to imagine continuously deforming a loop that never winds around zero into a loop that winds around zero $n$ times, without the loop ever crossing zero. It cannot be done, so somewhere along the deformation the image must pass through zero.
This is actually a generalization of the intermediate value theorem to two dimensions.[*] In particular, if a continuous function - any continuous function, not just analytic functions - maps a closed disk $D\to \mathbb C$ so that the boundary circle is sent to a path that "winds around zero" a non-zero number of times, then $f$ has a zero inside $D$.
This argument can be made rigorous when you get to topology. Defining "winds around" is non-obvious - you need to treat a path that goes around once clockwise and once counter-clockwise as not going around at all, for example.
When $f(z)$ is not a polynomial, you can no longer conclude that the image winds around zero at all, because you cannot show that $f$ is "close enough to" some function of the form $az^n$ that winds around zero.
It's instructive to see what happens to the function $e^z$ when $|z|=R$. Taking $z=Re^{ix}=R\cos x+iR\sin x$, we get that the argument of $e^{z}$ is $R\sin x$. And, indeed, as $x$ runs from $0$ to $2\pi$, this argument first increases by $R$ radians (counterclockwise), then decreases by $2R$ radians (clockwise), then increases by another $R$ radians. So there are zero net turns, even though the path takes huge swings around zero.
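The winding numbers described above can be checked numerically. A minimal sketch (the helper `winding_number` and the sample cubic are my own illustrations, not from the text):

```python
import numpy as np

def winding_number(f, R, samples=20000):
    """Net number of counterclockwise turns the image of |z| = R makes about 0."""
    t = np.linspace(0.0, 2.0 * np.pi, samples)
    w = f(R * np.exp(1j * t))
    # Unwrap the argument so consecutive samples differ by less than pi,
    # then count how many full turns the total change amounts to.
    total = np.unwrap(np.angle(w))
    return round((total[-1] - total[0]) / (2.0 * np.pi))

p = lambda z: z**3 - 2*z + 5          # an arbitrary cubic with p(0) != 0

print(winding_number(p, 0.1))         # small circle: 0 turns, no zero detected
print(winding_number(p, 100.0))       # large circle: 3 turns, just like z^3
print(winding_number(np.exp, 100.0))  # e^z: 0 net turns despite huge swings
```

The large-circle count equals the degree because the leading term dominates there, which is exactly the dominance argument the proof rests on.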
It can be shown that if $f(z)$ has no zeros on a disk $D$, then $f(z)$ has a continuous logarithm there - that is, $f(z)=e^{g(z)}$ for some continuous $g(z)$. If $f$ is analytic, then $g$ is analytic.
In terms of complex analysis, this all hinges on the fact that the logarithm function on the complex plane "branches" as you go around the circle. This will be more obvious when you see the Riemann surface view of branching functions - it is geometric, but it takes a bit of work to express at the level of a first course in complex analysis.
[*] In the real line, a disk is just an interval $[a,b]$ and the boundary is the pair of numbers $a,b$. So "winds" is the wrong term there, but if $0$ is between $f(a)$ and $f(b)$ then there has to be an $x\in[a,b]$ so that $f(x)=0$.
Solution 2:
The infinity of roots is there (think of the roots of $\sin(z)$), but in the case of the exponential they escape to infinity.
Below is a plot of the roots of the Taylor approximations of $e^x$ of degrees $5$, $7$ and $10$.
Also think of the approximations $\left(1+\dfrac zn\right)^n$, having a root of multiplicity $n$ at $z=-n$.
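The escape of the roots to infinity can be reproduced numerically. A rough sketch using `numpy.roots` (the degrees shown are those mentioned above; note that root-finding for high-degree polynomials is numerically delicate):

```python
import numpy as np
from math import factorial

def taylor_exp_roots(n):
    """Roots of the degree-n Taylor polynomial 1 + x + ... + x^n/n! of e^x."""
    coeffs = [1.0 / factorial(k) for k in range(n, -1, -1)]  # leading term first
    return np.roots(coeffs)

for n in (5, 7, 10):
    r = taylor_exp_roots(n)
    # The root closest to the origin drifts outward as the degree grows.
    print(n, round(min(abs(r)), 3))
```

Plotting these roots in the complex plane reproduces the picture referred to above: each degree contributes a little arc of roots, and the arcs move outward as the degree increases.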
Solution 3:
The key difference between (genuine, finite-degree) polynomials and Taylor series is that the former have leading terms. In particular, we can prove:
If $p(x)$ is a polynomial, then for all $N\in\mathbb{R}$ there is some $\delta$ such that, for all $z\in\mathbb{C}$, if $\vert z\vert>\delta$ then $\vert p(z)\vert> N$.
That is, all polynomials "shoot off to infinity." This is proved by looking at the leading term of $p$, and showing that it eventually dominates the rest of $p$.
Now, with this in hand, the proof of FTA is easy via complex analysis:
Liouville's theorem tells us that any bounded entire function is constant.
If $p(x)$ is a polynomial with no zeroes, then $f(x)={1\over p(x)}$ is entire.
By the fact cited above, $f(x)$ is in fact bounded: it is "small" outside of some compact set, and continuous (hence bounded) on that compact set.
So $f$ is constant, hence $p$ is constant; any nonconstant polynomial must therefore have a zero.
BUT, the fact cited above is complete nonsense for functions which aren't polynomials, since the absence of a leading term means their behavior can go any which way: for instance, look at $\lim_{x\rightarrow\infty}e^{-x}$ . . .
Solution 4:
The Fundamental Theorem of Algebra just posits the existence of zeros, not their location. Sequences of polynomials may have sequences of zeros which all diverge to infinity.
For $e^x$, the roots of the Taylor polynomials $f_n$ all grow in magnitude as $n$ grows. The classic paper appears to be
G. Szegő, Über eine Eigenschaft der Exponentialreihe, Sitzungsberichte der Berliner Math. Gesellschaft 21 (1922) 50-64; also in Collected Papers, vol. 1, Birkhäuser, Boston, 1982, pp. 645-662.
Reference taken from this paper where the result is stated on the first page:
The zeros $z_n$ of the $n^{th}$ Taylor polynomial for $\exp(z)$ divided by $n$, i.e. $\frac{z_n}{n}$, cluster around the curve $$K=\{z: |ze^{1-z}|= 1, |z| \leq 1\}.$$
Since $\frac{z_n}{n}$ is near $K$, and $K$ stays a positive distance away from the origin, $|z_n|$ grows at least linearly in $n$. We conclude that as $n\rightarrow\infty$, the zeros grow unboundedly in magnitude.
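This clustering can be probed numerically. A sketch, assuming `numpy.roots` is accurate enough at moderate degree (the degree and the closeness bound are my own loose choices):

```python
import numpy as np
from math import factorial

n = 20
coeffs = [1.0 / factorial(k) for k in range(n, -1, -1)]  # leading term first
w = np.roots(coeffs) / n               # scaled zeros z_n / n of the Taylor polynomial
szego = np.abs(w * np.exp(1.0 - w))    # |z e^{1-z}|, which equals 1 exactly on K
print(szego.min(), szego.max())        # roughly 1 already at n = 20
```

The values of $|ze^{1-z}|$ at the scaled zeros are already close to $1$ at modest degree, consistent with the Szegő result quoted above.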
Simple intuitive reasoning:
Assume that the Taylor polynomials $f_n(z)=\sum_{k=0}^n a_k z^k$ converge to $f(z)$ for all $z\in\mathbb{C}$. The zeros of $f_n$ coincide with the zeros of $$\frac{f_n(z)}{a_n}=g_n(z)=z^n+\frac{a_{n-1}}{a_n}z^{n-1}+\cdots+\frac{a_{0}}{a_n}.$$
The coefficients $\frac{a_k}{a_n}$ get large as $n$ gets large (for $e^x$ they are $\frac{n!}{k!}$). So it is reasonable that some sequences of Taylor polynomials require all of their roots to eventually have very large real and/or imaginary parts in order for the terms to cancel to zero. The roots of $f_n$ then lie outside some disc $D_n$ centered at the origin whose radius grows with $n$. Of course, $e^x$ is one such example.
The Taylor polynomials of some functions behave differently, e.g. those of $\sin(z)$. In this case, a growing number of zeros go off to infinity, but others converge to the integer multiples of $\pi$. So, in a sense, $\sin(z)$ has an infinite number of finite zeros and an infinite number of zeros at infinity. Of course, the concept of zeros at infinity is not meant to be taken literally as correct mathematics!