Quick answer: this method will not get you the roots any faster. Remember that, in general, for polynomials of degree 5 or more there is no explicit formula for the roots in terms of radicals (the Abel-Ruffini theorem). You simply cannot find one, with this method or any other.

Now, what is Vieta's theorem? It is in fact just the result of expanding the product $$ a_n \prod_{i=1}^n (x - r_i) = a_nx^n + \cdots + a_1x+a_0$$ and matching coefficients of like powers of $x$.
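A quick numerical check of this expansion (a sketch using numpy, with hypothetical example roots $1, 2, 3$): multiplying out the linear factors one at a time reproduces exactly the coefficients predicted by Vieta's relations.

```python
import numpy as np

roots = [1.0, 2.0, 3.0]

# expand prod(x - r_i) one linear factor at a time via polynomial multiplication
coeffs = np.array([1.0])
for r in roots:
    coeffs = np.convolve(coeffs, [1.0, -r])

e1 = 1 + 2 + 3        # sum of roots          -> -(coefficient of x^2)
e2 = 1*2 + 1*3 + 2*3  # sum of pairwise prods ->   coefficient of x^1
e3 = 1*2*3            # product of roots      -> -(constant term)
assert np.allclose(coeffs, [1.0, -e1, e2, -e3])
print(coeffs)  # [ 1. -6. 11. -6.]
```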


Numerically, however, for a monic polynomial $p(x)=x^n+c_{n-1} x^{n-1}+\cdots+c_1 x+c_0$, one can treat the Vieta equations relating the $n$ roots $x_k$ and the $n$ remaining coefficients $c_j$ as a system of $n$ simultaneous nonlinear equations in the $n$ unknowns $x_k$, and then apply the multivariate version of Newton-Raphson to them.

To wit, note that the Jacobian of the system (I use the quartic case here to keep things simple)

$$\begin{align*}x_1+x_2+x_3+x_4=&-c_3\\x_1 x_2+x_3 x_2+x_4 x_2+x_1 x_3+x_1 x_4+x_3 x_4=&c_2\\x_1 x_2 x_3+x_1 x_4 x_3+x_2 x_4 x_3+x_1 x_2 x_4=&-c_1\\x_1 x_2 x_3 x_4=&c_0\end{align*}$$

is

$$\begin{split}&\mathbf J(\mathbf x)=\mathbf J(x_1,x_2,x_3,x_4)=\\&\small\begin{pmatrix} 1 & 1 & 1 & 1 \\ x_2+x_3+x_4 & x_1+x_3+x_4 & x_1+x_2+x_4 & x_1+x_2+x_3 \\ x_2 x_3+x_4 x_3+x_2 x_4 & x_1 x_3+x_4 x_3+x_1 x_4 & x_1 x_2+x_4 x_2+x_1 x_4 & x_1 x_2+x_3 x_2+x_1 x_3 \\ x_2 x_3 x_4 & x_1 x_3 x_4 & x_1 x_2 x_4 & x_1 x_2 x_3\end{pmatrix}\end{split}$$

with inverse

$$\begin{split}&\mathbf J^{-1}(\mathbf x)=\mathbf J^{-1}(x_1,x_2,x_3,x_4)=\\&\tiny \begin{pmatrix}\frac{x_1^3}{(x_1-x_2) (x_1-x_3) (x_1-x_4)} &\frac{x_1^2}{(x_1-x_2) (x_1-x_3) (x_4-x_1)} &\frac{x_1}{(x_1-x_2) (x_1-x_3) (x_1-x_4)} & \frac1{(x_2-x_1)(x_1-x_3) (x_1-x_4)} \\ \frac{x_2^3}{(x_2-x_1) (x_2-x_3) (x_2-x_4)} &\frac{x_2^2}{(x_1-x_2) (x_2-x_3) (x_2-x_4)} &\frac{x_2}{(x_2-x_1) (x_2-x_3) (x_2-x_4)} & \frac1{(x_1-x_2) (x_2-x_3) (x_2-x_4)} \\\frac{x_3^3}{(x_1-x_3) (x_2-x_3) (x_3-x_4)} &\frac{x_3^2}{(x_1-x_3) (x_3-x_2) (x_3-x_4)} &\frac{x_3}{(x_1-x_3) (x_2-x_3) (x_3-x_4)} & \frac1{(x_1-x_3)(x_3-x_2) (x_3-x_4)} \\\frac{x_4^3}{(x_4-x_1) (x_4-x_2) (x_4-x_3)} &\frac{x_4^2}{(x_1-x_4) (x_4-x_2) (x_4-x_3)} &\frac{x_4}{(x_4-x_1) (x_4-x_2) (x_4-x_3)} & \frac1{(x_1-x_4)(x_4-x_2) (x_4-x_3)}\end{pmatrix}\end{split}$$
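One can verify this closed-form inverse numerically. The sketch below (numpy assumed, with arbitrary distinct test values for the $x_i$) builds $\mathbf J$ from the partial derivatives above, builds the claimed inverse, whose $(i,j)$ entry is $(-1)^{j+1}\,x_i^{\,n-j}\big/\prod_{k\neq i}(x_i-x_k)$ once the sign conventions in the denominators are unwound, and checks that their product is the identity.

```python
import numpy as np

# arbitrary distinct test values (the formula requires simple roots)
x1, x2, x3, x4 = 0.5, -1.2, 2.0, 3.7
x = np.array([x1, x2, x3, x4])

# Jacobian of the elementary symmetric polynomials, as in the matrix above
J = np.array([
    [1, 1, 1, 1],
    [x2+x3+x4, x1+x3+x4, x1+x2+x4, x1+x2+x3],
    [x2*x3+x3*x4+x2*x4, x1*x3+x3*x4+x1*x4, x1*x2+x2*x4+x1*x4, x1*x2+x2*x3+x1*x3],
    [x2*x3*x4, x1*x3*x4, x1*x2*x4, x1*x2*x3],
])

# Claimed inverse: entry (i, j) = (-1)^j * x_i^(3-j) / prod_{k != i}(x_i - x_k),
# with zero-based column index j (i.e. alternating signs + - + - across a row)
Jinv = np.array([
    [(-1)**j * x[i]**(3-j) / np.prod([x[i] - x[k] for k in range(4) if k != i])
     for j in range(4)]
    for i in range(4)
])

assert np.allclose(J @ Jinv, np.eye(4))
print("inverse verified")
```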

and thus, writing $s_1,\dots,s_4$ for the elementary symmetric polynomials on the left-hand sides of the Vieta equations above, the Newton-Raphson iteration function looks like this:

$$\mathbf x-\mathbf J^{-1}(\mathbf x)\begin{pmatrix}s_1+c_3\\s_2-c_2\\s_3+c_1\\s_4-c_0\end{pmatrix}$$

or explicitly,

$$\begin{align*}x_1-&\frac{c_3 x_1^3+c_2 x_1^2+c_1 x_1+c_0+x_1^4}{(x_1-x_2) (x_1-x_3)(x_1-x_4)}\\x_2-&\frac{c_3 x_2^3+c_2 x_2^2+c_1 x_2+c_0+x_2^4}{(x_2-x_1)(x_2-x_3) (x_2-x_4)}\\x_3-&\frac{c_3 x_3^3+c_2 x_3^2+c_1 x_3+c_0+x_3^4}{(x_3-x_1) (x_3-x_2) (x_3-x_4)}\\x_4-&\frac{c_3 x_4^3+c_2 x_4^2+c_1 x_4+c_0+x_4^4}{(x_4-x_1) (x_4-x_2) (x_4-x_3)}\end{align*}$$

This application of the multivariate Newton-Raphson method to the Vieta formulae is called the Durand-Kerner algorithm for simultaneously determining the roots of a polynomial:

$$x_i^{(k+1)}=x_i^{(k)}-\frac{p(x_i^{(k)})}{\prod\limits_{j\neq i} (x_i^{(k)}-x_j^{(k)})},\qquad i=1\dots n;\; k=0,1,\dots$$

where the $x_i^{(0)}$ are initial approximations (which, as with any Newton-Raphson method, are required). Like the usual Newton-Raphson method, it converges quadratically. Remarkably, though, the method is far less finicky about initial conditions than Newton-Raphson usually is. (In practice, one takes equally spaced points on a circle in the complex plane as starting values for the method, with the radius determined from bounds on the roots.)
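The iteration above can be sketched in a few lines of numpy. This is a minimal, hedged implementation, not production code: it uses the simple Cauchy bound $1+\max_j|c_j|$ for the circle radius, and the starting points are rotated slightly off the axes so no initial guess is real.

```python
import numpy as np

def durand_kerner(coeffs, tol=1e-12, max_iter=200):
    """Simultaneously approximate all roots of a monic polynomial.

    coeffs: [1, c_{n-1}, ..., c_1, c_0], highest power first.
    """
    coeffs = np.asarray(coeffs, dtype=complex)
    n = len(coeffs) - 1
    # starting points: equally spaced on a circle of radius 1 + max |c_j|
    # (a Cauchy bound on the roots), rotated so none lands on the real axis
    radius = 1 + max(abs(coeffs[1:]))
    x = radius * np.exp(2j * np.pi * (np.arange(n) + 0.25) / n)
    for _ in range(max_iter):
        p = np.polyval(coeffs, x)  # evaluate p at all current approximations
        denom = np.array([np.prod(x[i] - np.delete(x, i)) for i in range(n)])
        delta = p / denom          # the Durand-Kerner correction terms
        x = x - delta
        if np.max(np.abs(delta)) < tol:
            break
    return x

# hypothetical example: (x-1)(x-2)(x-3) = x^3 - 6x^2 + 11x - 6
roots = durand_kerner([1, -6, 11, -6])
print(np.sort_complex(roots))
```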

See Kerner's original paper for more details of the derivation.


The algorithm is also sometimes referred to as the Weierstrass-Durand-Kerner method, since Karl Weierstrass used this in his constructive proof of the fundamental theorem of algebra. See Weierstrass's paper for more on this.


Despite the natural urge to do so, you cannot use the symmetric functions to solve for the roots directly. It is instructive to try it and find yourself right back where you started; try it with a quadratic equation.
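To see the circularity concretely in the quadratic case: for $x^2+bx+c$ with roots $x_1,x_2$, Vieta gives $x_1+x_2=-b$ and $x_1x_2=c$. Eliminating $x_2$ by substituting $x_2=-b-x_1$ into the product yields

$$x_1(-b-x_1)=c\quad\Longleftrightarrow\quad x_1^2+bx_1+c=0,$$

which is exactly the equation we set out to solve.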