"Continuized" Taylor Series? $\sin(x)=\sum \frac{(-1)^nx^{2n+1}}{(2n+1)!}=\int_{-1}^\infty \frac{\cos(\pi n) x^{2n+1}}{G(2n+1)}dn$?

~~not trying to reinvent the Laplace transform, but just an exploration into these particular series and integrals~~

Current answers don't fully address the 5 questions, so any new ideas or suggestions would be much appreciated. Thanks for the help!

The Taylor Series for $e^x$ is $$\sum \frac{x^n}{n!}$$ Now isn't this just a discrete sum of functions? What if I use integrals to make a "continuous" version of the Taylor series? Following that motivation, I came up with $$E(x)=\int_{-\infty}^\infty \frac{x^n}{G(n)}dn$$ where $G(n)=\Gamma(n+1)=n!$. Since $\frac{x^n}{G(n)}$ goes to zero as $n\to -1$ (the Gamma function blows up there), I truncated the integral to $$E(x)=\int_{-1}^\infty \frac{x^n}{G(n)}dn$$ I graphed it on Desmos and it looked like this, with the green dotted line being $E(x)$:

[Desmos graph of $E(x)$ (green dotted line)]
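(If you'd rather check numerically than on Desmos, here is a minimal Python sketch of the same comparison; the function name `E`, the cutoff of the upper limit at $n=60$, and the sample points are arbitrary choices of mine, not part of the original graph.)

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import gamma

def E(x, upper=60.0):
    # E(x) = integral from -1 to "infinity" of x^n / Gamma(n+1) dn,
    # with the upper limit truncated at `upper`
    integrand = lambda n: x**n / gamma(n + 1)
    value, _ = quad(integrand, -1.0, upper, limit=200)
    return value

for x in [0.5, 1.0, 2.0, 5.0]:
    print(f"x = {x}:  E(x) = {E(x):.6f},  e^x = {np.exp(x):.6f}")
```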

After that, I was like "wow, nice! I wonder if other functions work too". Naturally, I moved on to $\sin(x)$, which has the power series $$\sum \frac{(-1)^nx^{2n+1}}{(2n+1)!}$$ Unfortunately, this power series is more complicated because of the $(-1)^n$ term. My first thought was that $\cos(\pi n)$ could serve as the continuous version of that term. So one such "continuized" version of the Taylor series for $\sin x$ would be $$S(x)=\int_{-1}^\infty \frac{\cos(\pi n) x^{2n+1}}{G(2n+1)}dn$$ Of course, that choice is somewhat arbitrary, so I also tried two other functions: $$c_1(x)=\cos^6\left(\frac{\pi x}{2}\right)-\sin^6\left(\frac{\pi x}{2}\right)$$ which is more "triangular", and $$c_2(x)=2\left(1-\sin^6\left(\frac{\pi x}{2}\right)\right)^6-1$$ which is more "square". You can see all three of these functions in orange. I made one integral with each of the three functions; the dotted green line is the one using $\cos(\pi n)$. Interestingly, I had to multiply it by a factor of $2$ to get it right. In these integrals, I raised the lower bound a bit to avoid crashing my computer, and made the upper bound finite but large enough that it makes little to no difference.

[Graph accessible here: https://www.desmos.com/calculator/eesis3ykai, though it may take a while to load]
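(As a rough numerical sanity check, separate from the Desmos graphs: a short Python sketch of the "continuized" sine with each of the three waveforms, including the factor of $2$; the names `S`, `wave`, `cos_wave`, the cutoff at $n=40$, and the test points are my own arbitrary choices.)

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import gamma

def S(x, wave, upper=40.0):
    # 2 * integral from -1 to `upper` of wave(n) * x^(2n+1) / Gamma(2n+2) dn
    integrand = lambda n: wave(n) * x**(2*n + 1) / gamma(2*n + 2)
    value, _ = quad(integrand, -1.0, upper, limit=400)
    return 2 * value

cos_wave = lambda n: np.cos(np.pi * n)
c1 = lambda n: np.cos(np.pi*n/2)**6 - np.sin(np.pi*n/2)**6   # the "triangular" wave
c2 = lambda n: 2*(1 - np.sin(np.pi*n/2)**6)**6 - 1           # the "square" wave

for x in [1.0, 2.0, 4.0]:
    print(x, np.sin(x), S(x, cos_wave), S(x, c1), S(x, c2))
```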

The $e^x$ case has already been addressed here: The function $f(x) = \int_0^\infty \frac{x^t}{\Gamma(t+1)} \, dt$, so my only contribution there is the nice graph. However, I found the $\sin x$ and $\cos x$ ones far more fascinating. Hence, I have a few questions:

(1) does $S(x)$ actually converge to $\sin(x)$?

(2) can we expect this "integral-Taylor series" to work on a lot of other functions? Is there some general result?

(3) why does $\cos(\pi n)$ work the best compared to the triangular and square waves? Why did the square wave fail so badly?

(4) why does the integral have to be stretched by a factor of two, when the $e^x$ integral didn’t have to be?

(5) is this approximation for these functions useful? Can this method be applied elsewhere? Is there any use to this outside just "ooh look at this neat graph"?

More cool stuff: I did the same thing with $\cos x$, and I got similar results (the bounds get shifted a bit though): $$C(x)=\int_{-0.5}^\infty \frac{\cos(\pi n) x^{2n}}{G(2n)}dn$$ [Graph accessible here: https://www.desmos.com/calculator/ctjqdxuw0h]

which also needs the strange multiplicative factor of $2$, and shows the same peculiar preference for $\cos(\pi n)$ as the continuous version of $(-1)^n$. So my five questions above still stand.
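(The analogous Python sketch for $C(x)$, with the same caveats: the name `C`, the cutoff, and the test points are arbitrary.)

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import gamma

def C(x, upper=40.0):
    # 2 * integral from -1/2 to `upper` of cos(pi*n) * x^(2n) / Gamma(2n+1) dn
    integrand = lambda n: np.cos(np.pi * n) * x**(2*n) / gamma(2*n + 1)
    value, _ = quad(integrand, -0.5, upper, limit=400)
    return 2 * value

for x in [1.0, 2.0, 4.0]:
    print(x, np.cos(x), C(x))
```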


Solution 1:

Here is a partial answer to some of the questions.

In this book, page 217 (see also this thread), it was established that $$ f(z)=\int_0^\infty\frac{z^t\,dt}{\Gamma(t+1)} = e^z+O(|z|^{-N}), $$ for any positive integer $N$. Putting $z=\pm ix$ and subtracting, we get $$ f(ix)-f(-ix)=2i\int_0^\infty\frac{\sin(\frac\pi2t)\,x^t\,dt}{\Gamma(t+1)} = 2i\sin x+O(|x|^{-N}), $$ or $$ S(x)=\int_0^\infty\frac{\sin(\frac\pi2t)\,x^t\,dt}{\Gamma(t+1)} = \sin x+O(|x|^{-N}). $$ Then the substitution $t=2s+1$ yields $$ S(x)=2\int_{-1/2}^\infty\frac{\cos(\pi s)\,x^{2s+1}\,ds}{\Gamma(2s+2)} = \sin x+O(|x|^{-N}). $$ This is your formula up to the change of the integral lower limit from $-1$ to $-1/2$. It answers Question 4 and Question 1 (in an asymptotic sense), and gives strong support to the conjecture of Question 3.
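A quick numerical illustration of this substitution, as a Python sketch with arbitrary truncation points (the names `S_t` and `S_s` are just for the sketch): the $t$-form and the $s$-form should agree with each other, and both should approximate $\sin x$ up to quadrature error.

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import gamma

def S_t(x, upper=60.0):
    # integral over t in (0, upper) of sin(pi*t/2) * x^t / Gamma(t+1) dt
    f = lambda t: np.sin(np.pi*t/2) * x**t / gamma(t + 1)
    return quad(f, 0.0, upper, limit=400)[0]

def S_s(x, upper=30.0):
    # after t = 2s + 1:
    # 2 * integral over s in (-1/2, upper) of cos(pi*s) * x^(2s+1) / Gamma(2s+2) ds
    f = lambda s: np.cos(np.pi*s) * x**(2*s + 1) / gamma(2*s + 2)
    return 2 * quad(f, -0.5, upper, limit=400)[0]

for x in [2.0, 5.0, 10.0]:
    print(x, np.sin(x), S_t(x), S_s(x))
```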

An intuitive reason behind the factor 2 in Question 4 is that while the factor $\frac{x^{2s+1}}{\Gamma(2s+2)}$ in the integrand behaves mildly as a function of $s$, the other factor $\cos(\pi s)$ is oscillatory with average magnitude $\frac12$. On the other hand, the Taylor series could be approximated by the same integral with $\cos(\pi s)$ replaced by a rectangular pulse train with amplitude $\pm1$, and therefore with average magnitude $1$.

Solution 2:

In none of the cases you have presented is there exact equality. The functions you define are only approximately equal to the corresponding infinite series (it's easy to convince yourself of this: pick a single value of $x$, evaluate both numerically, and compare). I'll try to explain why the approximation works, which will answer some of your questions. Any integral can be written as a sum

$$\int_0^\infty f(n){\rm d}n = \sum_{n=0}^\infty g(n)~~~\text{where}~~~g(n) \equiv \int_n^{n+1}f(n'){\rm d}n'$$ If $f$ does not change too rapidly over $[n,n+1]$ then we will have $g(n)\approx f(n)$ so we end up with $$\int_0^\infty f(n){\rm d}n \approx \sum_{n=0}^\infty f(n)$$ which is what you are seeing in your examples.
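To make this concrete, here is a small Python sketch of the sum-versus-integral comparison for $f(n)=x^n/\Gamma(n+1)$; the test point $x=4$ and the truncation at $100$ are arbitrary choices.

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import gamma

x = 4.0                                    # arbitrary test point
f = lambda n: x**n / gamma(n + 1)

series = sum(f(n) for n in range(100))     # truncated Taylor series for e^x
integral = quad(f, 0.0, 100.0, limit=200)[0]
print(series, integral, np.exp(x))         # close, but not exactly equal
```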

This approximation usually gives better results if $f(n)$ is monotone (as we can more easily control the error in these cases) and it's likely to produce worse results if it's oscillatory (as there doesn't have to be any clear relationship between $g(n)$ and $f(n)$ or $f(n+1)$).

Everything I wrote above is very rough, but it can be made more precise. For example, if $f$ is monotone decreasing then (this is the basis for the integral test of convergence) $$0 \leq \sum_{n=0}^\infty f(n) - \int_0^\infty f(n)\,{\rm d}n \leq f(0)$$ and we also have an explicit formula, the Abel–Plana formula, relating such a sum and integral (thanks J.M. for reminding me of this): $$\sum_{n=0}^\infty f(n) - \int_0^\infty f(n)\,{\rm d}n = \frac{1}{2}f(0) + i\int_0^\infty\frac{f(it) - f(-it)}{e^{2\pi t}-1}\,{\rm d}t$$
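For the curious, here is a rough Python check of the Abel–Plana identity for $f(n)=x^n/\Gamma(n+1)$ (which appears to satisfy the analyticity and growth hypotheses); the test point $x=3$ and the truncation points are arbitrary, and scipy's gamma function (which accepts complex arguments) is used to evaluate $f(\pm it)$.

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import gamma   # accepts complex arguments

x = 3.0                                    # arbitrary test point
f = lambda z: x**z / gamma(z + 1)          # f(n) = x^n / n!, extended off the real axis

lhs = sum(f(n) for n in range(100)) - quad(f, 0.0, 100.0, limit=200)[0]

# right-hand side of Abel-Plana; i * (f(it) - f(-it)) is real for real x > 0
corr = lambda t: np.real(1j * (f(1j * t) - f(-1j * t)) / (np.exp(2 * np.pi * t) - 1))
rhs = 0.5 * f(0) + quad(corr, 0.0, 30.0, limit=200)[0]

print(lhs, rhs)   # the two sides should agree up to quadrature error
```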