I was wondering if the following properties of the Legendre polynomials are true in general. They hold for the first ten or fifteen polynomials.

  1. Are the roots always simple (i.e., multiplicity $1$)?

  2. Is it true that, except in low-degree cases, the roots can't be calculated exactly, only approximated (unlike the Chebyshev polynomials)?

  3. Are the roots of the entire family of Legendre polynomials dense in the interval $[-1,1]$ (i.e., is it impossible to find a subinterval, no matter how small, that doesn't contain at least one root of one of the polynomials)?

If anyone knows of an article/text that proves any of the above, please let me know. The definition of these polynomials can be found on Wikipedia.


To resolve the second question, note first that the Legendre polynomials are odd functions for odd order (so $0$ is then a root of the polynomial) and even functions for even order. Thus, with regard to solubility in terms of radicals, you should be able to derive (possibly complicated!) radical expressions at least up to $P_9(x)$. To use that as an example, note that

$$\frac{P_9(\sqrt{x})}{\sqrt{x}}$$

is a quartic in $x$; thus, one can use the quartic formula to derive explicit expressions for its roots, from which the roots of $P_9(x)$ follow easily.
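As a quick sanity check of this reduction, here is a sketch using SymPy (an assumption on my part; no tools are named above) that $P_9(x)/x$, rewritten in $t=x^2$, really is a quartic:

```python
import sympy as sp

x, t = sp.symbols('x t')
p9 = sp.legendre(9, x)              # P_9(x), an odd polynomial of degree 9
q = sp.expand(sp.cancel(p9 / x))    # even polynomial of degree 8 in x
q_t = q.subs(x**2, t)               # rewrite in t = x^2
print(sp.degree(sp.Poly(q_t, t)))   # -> 4, so the roots are quartic-solvable
```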

$P_{10}(x)$ is where your trouble starts. If we take a look at the polynomial

$$P_{10}(\sqrt{x})$$

we have a quintic to contend with. I'll skip the relatively tedious details, but you can verify that its Galois group is not solvable, and thus its roots cannot be expressed in terms of radicals (you can use theta or hypergeometric functions, though).

So, there's not much hope on the symbolic front. On the numeric front, things are much easier. The slickest way of getting accurate values of the roots of the Legendre polynomial is to use the Jacobi matrix in my previous answer. Since there exist stable and efficient algorithms for the symmetric eigenproblem (e.g., the QR algorithm or divide-and-conquer, as implemented in LAPACK, for instance), and these can be set up so that only eigenvalues are returned, you have a good way of generating accurate approximate values of the Legendre polynomial roots. (In the context of Gaussian quadrature, where the roots of orthogonal polynomials play a pivotal role, this scheme is referred to as the Golub-Welsch algorithm.)
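A minimal sketch of this eigenvalue approach, assuming NumPy (the off-diagonal entries $n/\sqrt{4n^2-1}$ of the Jacobi matrix are derived further down):

```python
import numpy as np

def legendre_roots(n):
    """Roots of P_n as eigenvalues of the n-by-n Jacobi matrix."""
    k = np.arange(1, n)                        # 1, ..., n-1
    off = k / np.sqrt(4.0 * k**2 - 1.0)        # off-diagonal entries
    J = np.diag(off, 1) + np.diag(off, -1)     # zero main diagonal
    return np.linalg.eigvalsh(J)               # eigenvalues, ascending

# Cross-check against NumPy's own Gauss-Legendre nodes:
roots = legendre_roots(10)
nodes, _ = np.polynomial.legendre.leggauss(10)
print(np.max(np.abs(roots - nodes)))           # residual near machine precision
```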

Alternatively, as I mentioned in the comments, there exist asymptotic approximations for the roots, which can then be polished with a few applications of Newton-Raphson. One such asymptotic approximation is due to Francesco Tricomi. Letting $\xi_{n,k}$ be the $k$-th root of $P_n(x)$, with the roots ordered in decreasing order, we have

$$\xi_{n,k}\approx\left(1-\frac1{8n^2}+\frac1{8n^3}\right)\cos\left(\pi\frac{4k-1}{4n+2}\right)$$

where terms of order $O(n^{-4})$ and smaller are omitted. Other asymptotic approximations, due to Luigi Gatteschi, use the roots of Bessel functions, but I won't say more about those.
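The "approximate, then polish" scheme can be sketched in plain Python (standard library only); $P_n$ and $P_n'$ are evaluated with the standard three-term recurrence and the derivative identity $(1-x^2)P_n'(x)=n\,(P_{n-1}(x)-x P_n(x))$:

```python
import math

def legendre_p(n, x):
    """Return (P_n(x), P_n'(x)) via the three-term recurrence."""
    p_prev, p = 1.0, x                         # P_0, P_1
    for k in range(1, n):
        # (k+1) P_{k+1} = (2k+1) x P_k - k P_{k-1}
        p_prev, p = p, ((2*k + 1) * x * p - k * p_prev) / (k + 1)
    dp = n * (p_prev - x * p) / (1.0 - x * x)  # derivative identity
    return p, dp

def root(n, k, iters=5):
    """k-th root of P_n (decreasing order): Tricomi start + Newton polish."""
    x = (1 - 1/(8*n**2) + 1/(8*n**3)) * math.cos(math.pi*(4*k - 1)/(4*n + 2))
    for _ in range(iters):
        p, dp = legendre_p(n, x)
        x -= p / dp
    return x
```

For instance, `root(10, 1)` returns the largest root of $P_{10}$ to near machine precision after a handful of Newton steps.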


I'll answer question 1 only for now, but I might edit this to address the others later.

One should note that, corresponding to any set of orthogonal polynomials, there exists a symmetric tridiagonal matrix, called a Jacobi matrix, whose characteristic polynomial is the monic (leading coefficient $1$) version of the orthogonal polynomial considered. To use the Legendre polynomials as an explicit example, we first note that the monic Legendre polynomials satisfy the following three-term recurrence relation:

$$\hat{P}_{n+1}(x)=x \hat{P}_n(x)-\frac{n^2}{4 n^2-1}\hat{P}_{n-1}(x)$$

where $\hat{P}_n(x)=\frac{(n!)^2 2^n}{(2n)!}P_n(x)$ is the monic Legendre polynomial.
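This recurrence can be checked symbolically; a sketch assuming SymPy (whose `legendre` gives $P_n$):

```python
import sympy as sp

x = sp.symbols('x')

def monic_legendre(n):
    """Monic Legendre polynomial (n!)^2 2^n / (2n)! * P_n(x)."""
    scale = sp.factorial(n)**2 * 2**n / sp.factorial(2*n)
    return sp.expand(scale * sp.legendre(n, x))

n = 6  # any fixed order will do for a spot check
lhs = monic_legendre(n + 1)
rhs = sp.expand(x * monic_legendre(n)
                - sp.Rational(n**2, 4*n**2 - 1) * monic_legendre(n - 1))
print(sp.simplify(lhs - rhs))   # -> 0
```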

From this, we can derive an explicit expression for the corresponding Jacobi matrix (here I give the 5-by-5 case):

$$\begin{pmatrix}0&\frac{1}{\sqrt{3}}&0&0&0\\\frac{1}{\sqrt{3}}&0&\frac{2}{\sqrt{15}}&0&0\\0&\frac{2}{\sqrt{15}}&0&\frac{3}{\sqrt{35}}&0\\0&0&\frac{3}{\sqrt{35}}&0&\frac{4}{\sqrt{63}}\\0&0&0&\frac{4}{\sqrt{63}}&0\end{pmatrix}$$

(the general pattern is that you have $\frac{n}{\sqrt{4 n^2-1}}$ in the $(n,n+1)$ and $(n+1,n)$ positions, and 0 elsewhere.)

We now note that $\frac{n}{\sqrt{4 n^2-1}}$ can never be 0, and then use the fact that if a symmetric tridiagonal matrix has no zeroes in its sub- or superdiagonal, then all its eigenvalues have multiplicity 1. (A proof of this fact can be found in Beresford Parlett's The Symmetric Eigenvalue Problem.) Thus, all the roots of the Legendre polynomial are simple roots.

A more conventional proof of this fact is on page 27 of Theodore Chihara's An Introduction to Orthogonal Polynomials. Briefly, the argument is that $P_n(x)$ changes sign at least once within $[-1,1]$ (and thus has at least one zero of odd multiplicity within the support interval), since

$$\int_{-1}^1 P_n(u)\mathrm du=0$$

Now, the polynomial

$$P_n(x)\prod_{j=1}^k(x-\xi_j)$$

where the $\xi_j$ are the distinct zeroes of $P_n$ of odd multiplicity within $[-1,1]$, has constant sign on $[-1,1]$ (each sign change of $P_n$ is cancelled by the corresponding factor $(x-\xi_j)$), and thus its integral over $[-1,1]$ is nonzero. However, since

$$\int_{-1}^1 P_n(u) u^k\mathrm du=0\qquad\text{if}\qquad k < n$$

we would have a contradiction if $k < n$; hence $k = n$, and thus all the roots of the Legendre polynomial are simple (and lie within the support interval $[-1,1]$).


The density of the roots of any family of orthogonal polynomials follows from this result:

If $\{p_n\}$ is a family of orthogonal polynomials with roots in $[-1,1]$ and $N(a,b,n)$ denotes the number of roots of $p_n$ in $[\cos(b),\cos(a)]$, with $0\le a\le b\le\pi$, then

$$\lim_{n\to \infty} \frac{N(a,b,n)}{n} = \frac{b-a}{\pi}$$
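A quick numerical illustration of this limit (a sketch assuming NumPy, whose `leggauss` returns the roots of $P_n$):

```python
import numpy as np

a, b = 0.5, 1.0                                  # angles with 0 <= a < b <= pi
n = 500
roots, _ = np.polynomial.legendre.leggauss(n)    # roots of P_n, ascending
# N(a, b, n): number of roots falling in [cos(b), cos(a)]
count = np.count_nonzero((roots >= np.cos(b)) & (roots <= np.cos(a)))
print(count / n, (b - a) / np.pi)                # both approximately 0.159
```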