Is it impossible to perfectly fit a polynomial to a trigonometric function on a closed interval?

On a closed interval (e.g. $[-\pi, \pi]$), $\cos{x}$ has finitely many zeros. Thus I wonder if we could fit a finite degree polynomial $p:\mathbb{R} \to \mathbb{R}$ perfectly to $\cos{x}$ on a closed interval such as $[-\pi, \pi]$.

The Taylor series is

$$\cos{x} = \sum_{i=0}^{\infty} (-1)^i\frac{x^{2i}}{(2i)!} = 1 - \frac{x^2}{2} + \frac{x^4}{4!} - \frac{x^6}{6!} + \frac{x^8}{8!}-\dots$$

Using Desmos to graph $\cos{x}$ and $1-\frac{x^2}{2}$ yields:

[Graph: $\cos x$ and the first 2 terms of its Taylor series]

which is clearly imperfect on $[-\pi,\pi]$. Using a degree 8 polynomial (the first 5 terms of the Taylor series above) looks more promising:

[Graph: $\cos x$ and the first 5 terms of its Taylor series]

But upon zooming in very closely, the approximation is still imperfect:

[Graph: $\cos x$ and the first 5 terms of its Taylor series, near $x=\pi$]

There is no finite degree polynomial that equals $\cos{x}$ on all of $\mathbb{R}$ (although I do not know how to prove this either), but can we prove that no finite degree polynomial can perfectly equal $\cos{x}$ on any closed interval $[a,b]\subseteq \mathbb{R}$ with $a<b$? Would it be as simple as proving that the remainder term in Taylor's Theorem cannot be identically $0$ on the interval? But this would only prove that no Taylor polynomial can perfectly fit $\cos{x}$ on a closed interval...
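As a quick numerical companion to the Desmos plots, here is a small sketch (assuming NumPy is available; not part of the original plots) that measures how far the degree-8 Taylor polynomial is from $\cos x$ on $[-\pi,\pi]$:

```python
# Measure the worst-case gap between cos(x) and its degree-8 Taylor polynomial on [-pi, pi].
import numpy as np
from math import factorial

x = np.linspace(-np.pi, np.pi, 100001)

# Degree-8 Taylor polynomial of cos about 0: sum_{i=0}^{4} (-1)^i x^(2i) / (2i)!
p8 = sum((-1)**i * x**(2*i) / factorial(2*i) for i in range(5))

err = np.abs(p8 - np.cos(x))
print(err.max())   # roughly 0.024, attained near the endpoints x = +/- pi
```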


Yes, it is impossible.

Pick any point in the interior of the interval and any polynomial. If you differentiate the polynomial repeatedly at that point, you eventually get only zeroes. This doesn't happen for the cosine function: its derivatives cycle with period $4$ through $\cos, -\sin, -\cos, \sin$, and at any point $x_0$ the values $\pm\cos(x_0), \pm\sin(x_0)$ cannot all be zero, since $\cos^2(x_0)+\sin^2(x_0)=1$. Thus the cosine function cannot be a polynomial on a domain with non-empty interior.
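As a quick sanity check of this argument, here is a small SymPy sketch (SymPy assumed; the degree-8 Taylor polynomial is just a convenient concrete example) showing that repeated differentiation annihilates a polynomial while the derivatives of $\cos$ keep cycling:

```python
# Repeated differentiation kills any polynomial, but the derivatives of cos(x) cycle with period 4.
import sympy as sp

x = sp.symbols('x')

# Any concrete polynomial will do; take the degree-8 Taylor polynomial of cos as an example.
p = sum((-1)**i * x**(2*i) / sp.factorial(2*i) for i in range(5))

print(sp.diff(p, x, 9))            # 0 -- the 9th derivative of a degree-8 polynomial
print(sp.diff(sp.cos(x), x, 9))    # -sin(x), and the cycle cos, -sin, -cos, sin continues forever
```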


We don't even need to differentiate many times. Just note that $f'' = -f$ is satisfied by $f = \cos$ but not if $f$ is a non-zero polynomial function because $f''$ has lower degree than $f$. (This implicitly uses the fact that two polynomials that are equal at infinitely many points must be identical.)

To answer a comment on Claude's post, here is a neat proof. Define $\deg(\frac{g}{h}) = \deg(g)-\deg(h)$ for any polynomial functions $g,h$. Given any function $f = \frac{g}{h}$ where $g,h$ are polynomial functions on some non-trivial interval, we have $f' = \frac{g'}{h}-\frac{g\,h'}{h^2} = f\cdot\frac{g'h-g\,h'}{g\,h}$, and hence $\deg(f') < \deg(f)$ since $\deg(g'h-g\,h') < \deg(g\,h)$. Thus $\deg(f'') < \deg(f)$ and therefore $f'' \neq -f$. So even Padé approximants are not enough to perfectly fit anything except rational functions, on any non-trivial interval.
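Here is a small SymPy sketch of the degree bookkeeping (the particular rational function below is an arbitrary illustration, not anything canonical): differentiation strictly lowers $\deg(f) := \deg(g)-\deg(h)$, so $f'' = -f$ fails for a non-zero rational $f$:

```python
# Degree of a rational function drops under differentiation, so f'' = -f is impossible.
import sympy as sp

x = sp.symbols('x')

def rational_degree(f):
    g, h = sp.fraction(sp.cancel(f))          # write f as g/h in lowest terms
    return sp.degree(g, x) - sp.degree(h, x)

f = (x**3 + 2*x + 1) / (x**2 + 1)             # arbitrary example, deg = 3 - 2 = 1
print(rational_degree(f))                     # 1
print(rational_degree(sp.diff(f, x)))         # 0
print(rational_degree(sp.diff(f, x, 2)))      # -3  (strictly smaller again)
print(sp.simplify(sp.diff(f, x, 2) + f) == 0) # False: f'' != -f
```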


Here's a proof using only basic trigonometry and algebra, no calculus or infinite series required.

We'll do a proof by contradiction. Suppose $\cos(x)$ is a polynomial on some closed interval $[a,b]$, with $a\ne b$. We'll split it into two cases, depending on whether or not $0\in [a,b]$.

Case 1. Suppose the interval contains the origin, i.e. $a \le 0 \le b$. If $\cos(x)$ is a polynomial function on $[a,b]$, then $2\cos^2(\frac x 2) - 1$ is also a polynomial function on $[a,b]$, since $x\in[a,b]$ implies $x/2 \in [a,b]$. Now recall the half-angle formula for cosine: $$ \cos(x) = 2\cos^2\left(\frac x 2\right) - 1. $$ The half-angle formula tells us that these two polynomials are in fact the same polynomial on $[a,b]$. But if $\cos(x)$ has degree $n$ as a polynomial, then $2\cos^2(\frac x 2) - 1$ must have degree $2n$. Since two polynomials of different degrees cannot be equal on any interval, this forces $2n = n$, i.e. $n=0$. Since $\cos(x)$ is not constant, we have a contradiction, so $\cos(x)$ is not a polynomial on any interval containing $0$.
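A one-line SymPy check of the degree count (using the degree-8 Taylor polynomial merely as a stand-in for a hypothetical candidate $P$): for a polynomial $P$ of degree $n$, the expression $2P(x/2)^2 - 1$ has degree $2n$:

```python
# For a degree-n polynomial P, the half-angle substitute 2*P(x/2)**2 - 1 has degree 2n.
import sympy as sp

x = sp.symbols('x')
P = sum((-1)**i * x**(2*i) / sp.factorial(2*i) for i in range(5))   # degree 8, as an example

doubled = sp.expand(2 * P.subs(x, x/2)**2 - 1)
print(sp.degree(P, x), sp.degree(doubled, x))    # 8 16
```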

Case 2. Now, what if the interval does not contain the origin? This takes a few more steps, but we can show that if $\cos(x)$ is a polynomial on $[a,b]$, then it must also be a polynomial (potentially a different polynomial) on $[0,b-a]$, which contains the origin, so that is impossible by the argument above.

For $x\in [0,b-a]$, the angle sum formula gives $$ \cos(x) = \cos(x+a-a) = \cos(x+a)\cos(a) + \sin(x+a)\sin(a). $$ Since $\cos(x+a)$ is a polynomial in $x$, and $\sin^2(x+a) + \cos^2(x+a) = 1$, this means that on the interval $[0,b-a]$ the cosine of $x$ satisfies $$ \left(\cos(x) - p(x)\right)^2 = q(x) $$ for some polynomials $p$ and $q$; specifically $p(x) = \cos(x+a)\cos(a)$ and $q(x) = \sin^2(a) \left(1-\cos^2(x+a)\right)$. Equivalently, $\cos(x) = p(x) \pm \sqrt{q(x)}$.

Again, the half-angle formula tells us $\cos x = 2\cos^2(\frac x 2) - 1$ for $x\in[0,b-a]$. Substituting into the above, we get some rather messy algebra: \begin{eqnarray} \left(2\cos^2\left(\frac x 2\right) - 1 - p(x)\right)^2 &=& q(x)\\ \left(2p\left(\frac x 2\right)^2 \pm 4 p\left(\frac x 2\right)\sqrt{q\left(\frac x 2\right)} + 2q\left(\frac x 2\right) - 1 - p(x)\right)^2 &=& q(x)\end{eqnarray} Expanding the left-hand side, we get $$ q(x) = \left(2p\left(\tfrac x 2\right)^2+ 2q\left(\tfrac x 2\right) - 1 - p(x)\right)^2 + 16\, p\left(\tfrac x 2\right)^2 q\left(\tfrac x 2\right) \pm 8\left(2p\left(\tfrac x 2\right)^2+ 2q\left(\tfrac x 2\right) - 1 - p(x)\right)p\left(\tfrac x 2\right)\sqrt{q\left(\tfrac x 2\right)}. $$

Solving this equation for $\pm\sqrt{q(x/2)}$ expresses it as a quotient of polynomials, so $\pm\sqrt{q(x/2)}$ is actually a rational function. Since its square is a polynomial, $\pm\sqrt{q(x/2)}$ is a polynomial itself, and hence so is $\pm\sqrt{q(x)}$. Therefore $\cos(x) = p(x) \pm \sqrt{q(x)}$ is a polynomial for $x\in[0,b-a]$. Since this interval contains the origin, we again have a contradiction, so $\cos(x)$ cannot be a polynomial on $[a,b]$.
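For readers who want to verify the key identity symbolically, here is a short SymPy sketch (SymPy assumed) checking that $(\cos x - p(x))^2 = q(x)$ with the $p$ and $q$ defined above:

```python
# Check the identity (cos(x) - p(x))**2 == q(x) with
# p(x) = cos(x+a)*cos(a) and q(x) = sin(a)**2 * (1 - cos(x+a)**2).
import sympy as sp

x, a = sp.symbols('x a', real=True)
p = sp.cos(x + a) * sp.cos(a)
q = sp.sin(a)**2 * (1 - sp.cos(x + a)**2)

print(sp.simplify(sp.expand_trig((sp.cos(x) - p)**2 - q)))   # 0, so the identity holds
```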


As an addendum: All of these arguments can be generalized to show that $\cos(x)$ is also not a rational function on any interval, and that the other trig functions similarly are not polynomials or rational functions.


If $p$ is a polynomial, the function $f(z) = p(z)-\cos z$ is entire, and the uniqueness theorem shows that if $f(z) = 0$ on any line segment then $f\equiv 0$.

(The uniqueness theorem is stronger than that: it only needs $f$ to vanish on a sequence with an accumulation point.)

Addendum:

To clarify: if $f\equiv 0$, then $p = \cos$ everywhere; but a non-zero polynomial has at most $\deg p$ zeros, while $\cos$ has infinitely many, so this is impossible.


I do not know if you have any specific reason to require a polynomial.

Nevertheless, for function approximation, Padé approximants are much better than Taylor expansions even if, to some extent, they look similar. For example $$\cos(x) \sim \frac {1-\frac{115 }{252}x^2+\frac{313 }{15120}x^4 } {1+\frac{11 }{252}x^2+\frac{13 }{15120}x^4 }$$ is better than the degree-$8$ Taylor polynomial you considered.

To compare $$\int_{-\pi}^\pi \Big[ \frac {1-\frac{115 }{252}x^2+\frac{313 }{15120}x^4 } {1+\frac{11 }{252}x^2+\frac{13 }{15120}x^4 }-\cos(x)\Big]^2\,dx=0.000108$$ $$\int_{-\pi}^\pi \Big[1-\frac{x^2}{2}+\frac{x^4}{24}-\frac{x^6}{720}+\frac{x^8}{40320}-\cos(x)\Big]^2\,dx=0.000174$$ but nothing is absolutely perfect.
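For anyone who wants to reproduce these numbers, here is a short sketch (SciPy and NumPy assumed) that computes both $L^2$-error integrals numerically:

```python
# Compare the L2 error of the [4/4] Pade approximant and the degree-8 Taylor polynomial on [-pi, pi].
import numpy as np
from scipy.integrate import quad

def pade(x):
    num = 1 - 115/252 * x**2 + 313/15120 * x**4
    den = 1 + 11/252 * x**2 + 13/15120 * x**4
    return num / den

def taylor8(x):
    return 1 - x**2/2 + x**4/24 - x**6/720 + x**8/40320

for f in (pade, taylor8):
    val, _ = quad(lambda t: (f(t) - np.cos(t))**2, -np.pi, np.pi)
    print(val)   # approximately 1.08e-4 and 1.74e-4, matching the values quoted above
```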

If I add one more term to the Padé approximant, the value of the corresponding integral becomes $1.25\times 10^{-9}$, and at $x=\frac \pi 2$ the approximant takes the value $-6.57\times 10^{-9}$ instead of $0$.

Now, have a look at an approximation I built for you $$\cos(x)\approx\frac{1-\frac{399 }{881}x^2+\frac{20 }{1037}x^4 } {1+\frac{58 }{1237}x^2+\frac{1}{756}x^4 }$$ which gives $1.49\times 10^{-8}$ for the integral.