Continuity of the roots of a polynomial in terms of its coefficients

Solution 1:

Here is a version of continuity of the roots.
Consider the monic complex polynomial $f(z)=z^n+c_1z^{n-1}+...+c_n\in \mathbb C[z]$ and factor it as $$f(z)=(z-a_1)...(z-a_n) \quad (a_k\in \mathbb C)$$ where the roots $a_k$ are arranged in some order, and of course needn't be distinct.
Then for every $\epsilon \gt 0$, there exists $\delta \gt 0$ such that every polynomial $ g(z) =z^n+d_1z^{n-1}+...+d_n\in \mathbb C[z]$ satisfying $|d_k-c_k|\lt \delta \quad (k=1,...,n)$ can be written $$g(z)=(z-b_1)...(z-b_n) \quad (b_k\in \mathbb C)$$
with $|b_k-a_k|\lt \epsilon \quad (k=1,...,n)$.
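As a quick numerical illustration of this statement (not part of the proof), here is a Python sketch; the test polynomial, the perturbation size $\delta$, and the pairing of each $b_k$ with an $a_k$ via an optimal assignment are choices made for the example only.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# f(z) = (z - 1)^2 (z - 3) = z^3 - 5z^2 + 7z - 3, with a double root at 1.
c = np.array([1.0, -5.0, 7.0, -3.0])     # monic coefficients, highest degree first
a = np.roots(c)

rng = np.random.default_rng(0)
delta = 1e-6
d = c.copy()
d[1:] += rng.uniform(-delta, delta, 3)   # perturb c_1, ..., c_n; g stays monic
b = np.roots(d)

# Pair each b_k with a distinct a_k by minimizing the total distance (optimal assignment).
cost = np.abs(b[:, None] - a[None, :])
rows, cols = linear_sum_assignment(cost)
print(cost[rows, cols].max())            # small, though only about sqrt(delta) near the double root
```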

A more geometric version is to consider the Viète map $v:\mathbb C^n \to \mathbb C^n $ sending, in the notation above, $(a_1,...,a_n)$ to $(c_1,...,c_n)$ (identified with $z^n+c_1z^{n-1}+...+c_n=(z-a_1)...(z-a_n)$ ).
It is a polynomial map (and so certainly continuous!) since $c_k=(-1)^{k} s_k( a_1,...,a_n)$, where $s_k$ is the $k$-th symmetric polynomial in $n$ variables.
There is an obvious action of the symmetric group $S_n$ on $\mathbb C^n$ and the theorem of continuity of the roots states that the Viète map descends to a homeomorphism $w: \mathbb C^n / S_n \to \mathbb C^n$. It is easy to see that $w$ is a continuous bijection (continuous by the definition of the quotient topology, bijective by the fundamental theorem of algebra and uniqueness of the factorization), but continuity of the inverse is the difficult part.
The difficulty is concentrated at those points $(c_1,...,c_n)$ corresponding to polynomials $z^n+c_1z^{n-1}+...+c_n$ having multiple roots.

This, and much more, is proved in Whitney's Complex Analytic Varieties (see App. V.4, pp. 363 ff).
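For concreteness, `numpy.poly` computes exactly this Viète map (roots to monic coefficients), which makes the $S_n$-invariance easy to check numerically; the roots below are an arbitrary example.

```python
import numpy as np

# The Viète map v sends (a_1, ..., a_n) to the coefficients (c_1, ..., c_n) of
# z^n + c_1 z^{n-1} + ... + c_n = (z - a_1)...(z - a_n).
a = np.array([2.0, 2.0, -1.0 + 1.0j])
c = np.poly(a)                            # returns [1, c_1, ..., c_n]
print(c)

# Permuting the roots gives the same coefficients: v is constant on S_n-orbits,
# so it factors through the quotient C^n / S_n.
print(np.allclose(np.poly(a[::-1]), c))   # True
```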

Algebraic geometry point of view Since you are interested in general algebraically closed fields $k$, here is an interpretation for that case.
The symmetric group $S_n$ acts on $\mathbb A_k^n$ and the problem is whether the quotient set $\mathbb A_k^n /S_n$ has a reasonable algebraic structure. The answer is yes and the Viète map again descends to an isomorphism of algebraic varieties $\mathbb A_k^n /S_n \stackrel {\sim }{\to} \mathbb A_k^n $.
This is the geometric interpretation of the fundamental theorem on symmetric polynomials.
The crucial point is that the symmetric polynomials are a finitely generated $k$-algebra.
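As a toy instance of that theorem (my own example, checked with sympy): a symmetric polynomial such as $a^2+b^2+c^2$ is a polynomial in the elementary symmetric polynomials.

```python
import sympy as sp

a, b, c = sp.symbols('a b c')

# Elementary symmetric polynomials in three variables.
e1 = a + b + c
e2 = a*b + a*c + b*c

# The symmetric polynomial a^2 + b^2 + c^2 is a polynomial in e1 and e2:
p2 = a**2 + b**2 + c**2
print(sp.expand(e1**2 - 2*e2 - p2))   # 0, i.e. a^2 + b^2 + c^2 = e1^2 - 2 e2
```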

Hilbert's 14th problem was whether more generally the invariants of a polynomial ring under the action of a linear group form a finitely generated algebra. Emmy Noether proved in 1926 that the answer is yes for a finite group (in any characteristic), as illustrated by $S_n$.
However, Nagata announced counterexamples (in all characteristics) to Hilbert's 14th problem at the International Congress of Mathematicians in 1958 and published them in 1959.

Solution 2:

I think there might be a proof of your statement using the following complex analysis trick (I don't know if a similar idea could work in $\mathbb{C}_p$): if $U$ is an open subset with smooth boundary $\partial U$, consider

$$N_{U}(p) = \frac{1}{2i \pi} \oint_{\partial U} \frac{p'(z)}{p(z)}dz$$

When it's defined, $N_U(p)$ is the number of zeros of $p$ in $U$ counted with multiplicity. Then fix a polynomial $p_0$ of degree $n$, and pick $U$ a bounded neighborhood of its zeros. The map $p \mapsto N_U(p)$ is well defined and continuous in a neighborhood of $p_0$ (since $p_0$ has no zeros on $\partial U$, nor does any sufficiently close $p$), but since it can only take integer values, it's constant and equal to $n$. So if $p$ in that neighborhood has degree $n$, all its roots are in $U$.
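A rough numerical sketch of that trick, for $U$ a disk and with the contour integral approximated by a Riemann sum over the circle; the function name `zero_count`, the test polynomial, and the discretization are choices made for the illustration.

```python
import numpy as np

def zero_count(coeffs, center, radius, m=2000):
    """Approximate N_U(p) for the disk U = {|z - center| < radius}."""
    theta = np.linspace(0.0, 2.0 * np.pi, m, endpoint=False)
    z = center + radius * np.exp(1j * theta)              # points on the circle dU
    dz = 1j * radius * np.exp(1j * theta) * (2.0 * np.pi / m)
    return (np.polyval(np.polyder(coeffs), z) / np.polyval(coeffs, z) * dz).sum() / (2j * np.pi)

# p_0(z) = (z - 1)(z - 2)(z + 5): exactly two roots in the disk |z - 1.5| < 1.
p0 = np.poly([1.0, 2.0, -5.0])
print(zero_count(p0, 1.5, 1.0))           # approximately 2
print(zero_count(p0 + 1e-8, 1.5, 1.0))    # still approximately 2: the integer value cannot jump
```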

Solution 3:

I posed this as a problem in a course on local fields I taught a little while ago. One of my students, David Krumm, solved it and wrote it up here. The context of David's solution is that $K$ is an arbitrary normed field, with some chosen extension of the norm to the algebraic closure of $K$. (If $K$ is complete or even Henselian, the norm extends uniquely; in general it does not.) Then he shows that for every $\epsilon > 0$ there exists some $\delta > 0$ so that if you perturb each of the coefficients of your polynomial $f$ by at most $\delta$, every root of $f_{\delta}$ is within $\epsilon$ of some root of $f$ and vice versa. (I didn't think of this until just now, but I guess this is equivalent to saying that the sets of roots are within $\epsilon$ of each other for the Hausdorff metric.) He also shows that if $f$ itself has distinct roots, then for sufficiently small $\delta$ so does $f_{\delta}$ and then you can match up the roots in a canonical way.
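To make the Hausdorff-metric remark concrete in the archimedean case $K=\mathbb C$, here is a small numpy sketch; the polynomial and the perturbation are arbitrary.

```python
import numpy as np

def hausdorff(A, B):
    """Hausdorff distance between two finite sets of complex numbers."""
    D = np.abs(np.asarray(A)[:, None] - np.asarray(B)[None, :])
    return max(D.min(axis=1).max(), D.min(axis=0).max())

f = np.array([1.0, 0.0, -2.0, 5.0])                       # f(z) = z^3 - 2z + 5, simple roots
delta = 1e-7
f_delta = f + delta * np.array([0.0, 1.0, -1.0, 1.0])     # perturb the non-leading coefficients

print(hausdorff(np.roots(f), np.roots(f_delta)))          # on the order of delta, since the roots are simple
```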

After he solved this problem I looked into the literature and found a dozen papers or more on various refinements of it, including some very recent ones. At the moment these papers seem to be hiding from me, but if/when I find them I'll give some references.

Solution 4:

Suppose $P_a(z)=\sum\limits_{k=0}^na_kz^k$. Regarding a root $z$ as a function of the coefficients and taking the partial of $P_a(z)=0$ with respect to $a_k$, we get $$ 0=P_a^{\;\prime}(z)\frac{\partial z}{\partial a_k} + z^k $$ Thus, $$ \frac{\partial z}{\partial a_k}=-\frac{z^k}{P_a^{\;\prime}(z)} $$ The existence of these partial derivatives is guaranteed by the Implicit Function Theorem.

Thus, as long as $P_a^{\;\prime}(z)\ne0$ when $P_a(z)=0$ (that is, as long as the root $z$ is simple), $\frac{\partial z}{\partial a_k}$ exists and is finite. Therefore, if $P_a$ has no repeated roots, each $\frac{\partial z}{\partial a_k}$ is finite.

This shows that unless $P_a$ has repeated roots, each root is a smooth function of the coefficients.
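As a sanity check of this formula (my own example, not part of the argument), one can compare the predicted derivative with a finite difference computed from `numpy.roots`:

```python
import numpy as np

# P_a(z) = (z - 1)(z - 2)(z - 3) = z^3 - 6z^2 + 11z - 6, stored highest degree first.
coeffs = np.array([1.0, -6.0, 11.0, -6.0])
z0 = 3.0                                   # a simple root
k = 1                                      # perturb a_1, the coefficient of z^1
h = 1e-8

predicted = -z0**k / np.polyval(np.polyder(coeffs), z0)   # -z^k / P_a'(z)

perturbed = coeffs.copy()
perturbed[len(coeffs) - 1 - k] += h        # a_k sits at index n - k in numpy's ordering
roots = np.roots(perturbed)
z1 = roots[np.argmin(np.abs(roots - z0))]  # the perturbed root closest to z0

print(predicted, (z1 - z0) / h)            # the two values agree to several digits
```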