How do we know the Taylor expansion for $e^x$ works for all $x$? Or that it's analytic?

Let's say I want to use a Maclaurin series to get the series expansion $S(x)$ for $f(x) = e^x$, where $S(x) = c_0x^0 + c_1x^1 + c_2x^2 + c_3x^3 + \dots$

$f(0) = S(0) = c_0 = e^0 = 1$ so that's fine.

$f'(0) = S'(0) = c_1 = e^0 = 1$ again, fine.

$f''(0) = S''(0) = 2c_2 = e^0 = 1$ so $c_2 = 1/2$, fine.

$f'''(0) = S'''(0) = 6c_3 = e^0 = 1$ so $c_3 = 1/6$, fine.

$f''''(0) = S''''(0) = 24c_4 = e^0 = 1$ so $c_4 = 1/24$, fine.

And so on; we conclude that $c_k = \frac{1}{k!}$, and therefore:

$$S(x) = \sum_{k=0}^{\infty} \frac{x^k}{k!}$$
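Numerically the partial sums do seem to approach $e^x$, even far from $0$; here's the throwaway Python script I used to check (the helper name is my own, and this is of course an illustration, not a proof):

```python
# Sanity check (not a proof): partial sums of sum x^k / k! vs math.exp(x).
import math

def maclaurin_exp(x, n_terms):
    """Partial sum of x**k / k! for k = 0, ..., n_terms - 1."""
    total, term = 0.0, 1.0  # term starts at x**0 / 0! = 1
    for k in range(n_terms):
        total += term
        term *= x / (k + 1)  # turn x**k/k! into x**(k+1)/(k+1)!
    return total

x = 3.7
for n in (5, 10, 20, 40):
    approx = maclaurin_exp(x, n)
    print(n, approx, abs(approx - math.exp(x)))
# The error shrinks rapidly, even for x far from the expansion point 0.
```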

But what I don't understand is what allows us to go "Furthermore, $S(x) = f(x)$ for all $x$"! What makes it valid to plug in any number other than $x=0$? I see this series used as a straight-up equivalent of $e^x$ even though we used $x=0$ and nothing else.

So I thought I would try it with a neighborhood of $1$ instead to see what happens:

$S(x) = c_0(x-1)^0 + c_1(x-1)^1 + c_2(x-1)^2 + c_3(x-1)^3 + ...$

$f(1) = S(1) = c_0 = e^1 = e$ so that's fine.

$f'(1) = S'(1) = c_1 = e^1 = e$ again, fine.

$f''(1) = S''(1) = 2c_2 = e^1 = e$ so $c_2 = e/2$, fine.

$f'''(1) = S'''(1) = 6c_3 = e^1 = e$ so $c_3 = e/6$, fine.

$f''''(1) = S''''(1) = 24c_4 = e^1 = e$ so $c_4 = e/24$, fine.

Looks pretty similar:

$$S(x) = \sum_{k=0}^{\infty} \frac{e(x-1)^k}{k!}$$

and, in fact, for a neighborhood of $x=a$:

$$S(x) = \sum_{k=0}^{\infty} \frac{e^a(x-a)^k}{k!}$$
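Checking numerically, the two expansions agree wherever I evaluate them; here's the quick illustrative script I used (the helper name is my own again):

```python
# Compare the series centered at a = 0 and at a = 1 at the same point x.
import math

def taylor_exp(x, a, n_terms):
    """Partial sum of e^a * (x - a)**k / k! for k = 0, ..., n_terms - 1."""
    total, term = 0.0, math.exp(a)  # leading term e^a * (x-a)**0 / 0!
    for k in range(n_terms):
        total += term
        term *= (x - a) / (k + 1)
    return total

x = 2.5
print(taylor_exp(x, 0.0, 30))  # centered at 0
print(taylor_exp(x, 1.0, 30))  # centered at 1
print(math.exp(x))             # both match e^x to machine precision
```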

Now we have $e$ inside the series expansion for $e^x$, which seems a little circular?

I guess my question is this: we constructed $S$ to have the same $n$th derivatives as $f(x)$, but only at a single point ($x=0$, $x=1$, or $x=a$). The series representations look different depending on which center I pick, and in some cases the expansion contains the very number we're trying to describe.

How do we know which one is "right", or that either is even "right" to use for all $x$? I know the usual response is that the series equals the function when $f(x)$ is analytic, but that doesn't help me at all, because "analytic" is defined to mean that the Taylor series around $x_0$ converges to the function in a neighborhood of $x_0$, for every $x_0$ in the function's domain. How do I know that holds here?

Yet again it feels circular... Working through these two examples makes me wonder: how do I know these series converge to the function itself? How do I know they're equivalent representations? Am I supposed to avoid the self-reference somehow?

I'm looking for some context on how to make sense of these two Taylor series: how am I supposed to know that $e^x$ is analytic, and that I can use $S(x)$ for any $x$ I want, even though I only computed it in the neighborhood around $x=0$?


Solution 1:

If $x\in\mathbb R$, you want to know why it is true that
$$e^x=\sum_{n=0}^\infty\frac{x^n}{n!}.\tag1$$
Well, that's because, by Taylor's theorem, you have
$$(\forall N\in\mathbb{N}):\quad e^x-\sum_{n=0}^N\frac{x^n}{n!}=\frac{e^y}{(N+1)!}x^{N+1}$$
for some $y$ between $0$ and $x$; this comes from the Lagrange form of the remainder. Since $y$ lies between $0$ and $x$, we have $e^y\leqslant e^x$ if $x>0$ and $e^y\leqslant e^0=1$ otherwise, so
$$\left|e^x-\sum_{n=0}^N\frac{x^n}{n!}\right|\leqslant\begin{cases}\dfrac{e^x|x|^{N+1}}{(N+1)!}&\text{ if }x>0\\[1ex]\dfrac{|x|^{N+1}}{(N+1)!}&\text{ otherwise.}\end{cases}$$
For fixed $x$, the factorial in the denominator eventually grows much faster than the power $|x|^{N+1}$, and therefore
$$\lim_{N\to\infty}\left|e^x-\sum_{n=0}^N\frac{x^n}{n!}\right|=0.$$
In other words, $(1)$ holds.
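Here is a small numerical illustration of this estimate (a sketch separate from the proof; the helper name is made up): the actual truncation error stays below the Lagrange bound, and both collapse to $0$ as $N$ grows.

```python
# Actual truncation error vs the Lagrange remainder bound, for x > 0.
import math

def partial_sum(x, N):
    """sum_{n=0}^{N} x**n / n!, computed incrementally."""
    total, term = 0.0, 1.0
    for n in range(N + 1):
        total += term
        term *= x / (n + 1)
    return total

x = 4.0  # x > 0, so the bound is e^x * x^(N+1) / (N+1)!
for N in (5, 10, 15, 20):
    error = abs(math.exp(x) - partial_sum(x, N))
    bound = math.exp(x) * x ** (N + 1) / math.factorial(N + 1)
    print(N, error, bound)  # error <= bound, and both tend to 0
```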

Solution 2:

Once you learn complex analysis, these things become clear.

Here is a way to argue that these series converge to $e^x$. Define $$S(x) = \sum_{k=0}^{\infty} \frac{e^a(x-a)^k}{k!}$$

The ratio/root test gives that the radius of convergence is $\infty$, so the series converges for every $x$; and since a power series can be differentiated term by term inside its radius of convergence, $S(x)$ is a function which is differentiable everywhere.

A simple computation shows that $S'(x)=S(x)$ and that $S(a)=e^a$. The next lemma solves the problem.

Lemma: If $f(x)$ is a function which is differentiable everywhere and $f'(x)=f(x)$, then $f(x)=Ce^x$ for some constant $C$.

Proof: Let $g(x)=\frac{f(x)}{e^x}$. Then, by the quotient rule, $g'(x)=\frac{f'(x)e^x-f(x)e^x}{e^{2x}}=\frac{f'(x)-f(x)}{e^x}=0$, and therefore $g(x)=C$ for some constant $C$.

Now the Lemma gives that $S(x)=Ce^x$ for some $C$, and $S(a)=e^a$ then forces $C=1$.

P.S. A more general way of proving results of this type is via the Lagrange estimate of the remainder; this answer exploits the special properties of $e^x$.
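For the skeptical reader, here is a sketch in SymPy of the "simple computation" $S'(x)=S(x)$, done on a truncated series (the truncation order $N$ is an arbitrary choice for the demo, not part of the argument): differentiation shifts each term $e^a(x-a)^k/k!$ down to $e^a(x-a)^{k-1}/(k-1)!$, so the truncated series reproduces itself minus its highest-order term.

```python
# Term-by-term differentiation of a truncated S: S_N'(x) == S_{N-1}(x).
import sympy as sp

x, a = sp.symbols('x a')
N = 8  # truncation order, chosen arbitrarily for the demonstration
S_N = sum(sp.exp(a) * (x - a)**k / sp.factorial(k) for k in range(N + 1))
top_term = sp.exp(a) * (x - a)**N / sp.factorial(N)  # highest-order term
# Differentiating drops the constant term and shifts every other term down,
# so S_N' equals S_N minus the top term (i.e. S_{N-1}).
print(sp.simplify(sp.diff(S_N, x) - (S_N - top_term)))  # prints 0
```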

Solution 3:

I think you need to get familiar with the key definitions and theorems of calculus. The fact that a function can be represented by its Taylor series under certain circumstances is covered by Taylor's theorem, one form of which is this:

Taylor's Theorem: Let $n, p$ be positive integers with $1\leq p\leq n$, and let $a, h$ be real numbers with $h>0$. If $f:[a, a+h]\to\mathbb{R}$ is a function such that its $n$th derivative $f^{(n)}$ exists on $(a, a+h)$ and its $(n-1)$th derivative $f^{(n-1)}$ is continuous on $[a, a+h]$, then there is some number $\theta\in(0, 1)$ such that $$f(a+h)=f(a)+hf'(a)+\frac{h^2}{2!}f''(a)+\dots+\frac{h^{n-1}}{(n-1)!}f^{(n-1)}(a)+R_{n}\tag{1}$$ where $$R_{n}=\frac{(1-\theta)^{n-p}h^{n}f^{(n)}(a+\theta h)}{p(n-1)!}\tag{2}$$ The theorem holds trivially if $h=0$, and if $h$ is negative we just need to consider intervals of the form $[a+h, a]$.

From your question I gather that the definition of $e^x$ being used is that $e^x$ is its own derivative and takes the value $1$ at $0$. Without going into the details of this definition, note that it implies $e^x$ possesses derivatives of all orders, and every derivative is again $e^x$.

Now it is time to apply Taylor's theorem to $f(x)=e^x$. We choose $a=0$, $p=n$, replace the symbol $h$ by $x$, and note that $f^{(n)}(0)=1$ for all $n$. We then obtain $$e^x=f(x)=1+x+\frac{x^2}{2!}+\dots+\frac{x^{n-1}}{(n-1)!}+R_n\tag{3}$$ where $R_n=x^ne^{\theta x}/n!$ for some $\theta\in(0,1)$. Given any $x$ we have $|R_n|\leq |x|^ne^{|x|}/n!$, and since the factorial $n!$ eventually grows much faster than the power $|x|^n$, it follows that $\lim_{n\to\infty}R_n=0$. Taking the limit as $n\to\infty$ in equation $(3)$ we get the identity $$e^x=1+x+\frac{x^2}{2!}+\dots=\sum_{n=0}^{\infty}\frac{x^n}{n!}\tag{4}$$ To summarize: one can figure out the Taylor series of a function by calculating its derivatives at a certain point, but whether the function is represented by its Taylor series depends crucially on the behavior of the remainder $R_n$ as $n\to\infty$.
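A quick numerical illustration (not needed for the proof) of why the remainder vanishes: for fixed $x$, going from $x^{n-1}/(n-1)!$ to $x^n/n!$ multiplies by $x/n$, which is eventually tiny, so the terms first grow and then collapse.

```python
# Watch x**n / n! rise and then collapse toward 0 for a fixed large x.
x = 10.0
term = 1.0  # x**0 / 0!
for n in range(1, 41):
    term *= x / n  # now term == x**n / n!
    if n % 5 == 0:
        print(n, term)
# Terms grow while n < x, peak near n == x, then die off factorially fast.
```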

The choice $a=0$ is arbitrary, and using a generic $a$ we get the identity (as before) $$e^x=e^a\sum_{n=0}^{\infty}\frac{(x-a)^n}{n!}\tag{5}$$ Using $(4)$ we then get the fundamental identity $$e^x=e^ae^{x-a}$$ By the way, this can also be established directly from the definition of $e^x$ (being its own derivative and taking the value $1$ at $0$). So everything works out fine.

You should also study the proof of Taylor’s theorem to understand the above arguments completely. The statement of a theorem gives us a certain guarantee but it is the proof of the theorem which makes us believe that the guarantee provided is genuine.


There is another approach to proving identity $(4)$ for all real values of $x$ which uses just the definition of $e^x$ and avoids the slightly complicated Taylor's theorem. We first show that the solution to the differential equation $$f'(x)=f(x),\quad f(0)=1\tag{6}$$ is unique, and then show that $$S(x)=\sum_{n=0}^{\infty}\frac{x^n}{n!}\tag{7}$$ is a solution to $(6)$. The proof requires some not-so-difficult results from the theory of infinite series.

First we establish the uniqueness of the solution to $(6)$. Suppose we have two solutions $f, g$; then $F(x)=f(x)-g(x)$ satisfies $F'(x)=F(x)$, $F(0)=0$. We show that $F(x)=0$ for all $x$. If $F(a)\neq 0$ for some $a$, we consider $$\phi(x)=F(a+x)F(a-x)$$ and clearly $$\phi'(x)=F'(a+x)F(a-x)-F(a+x)F'(a-x)$$ which is equal to $$F(a+x)F(a-x)-F(a+x)F(a-x)=0$$ so $\phi(x)$ is constant, and then $\phi(x)=\phi(0)=F(a)F(a)>0$. But $\phi(a)=F(2a)F(0)=0$ and we get a contradiction. Thus $F(x)=0$ for all $x$, and equation $(6)$ does not possess two distinct solutions.

Now we need to show that $S(x)$ defined in $(7)$ satisfies $(6)$. First, the ratio test shows that the series in $(7)$ converges for all $x$, so the definition $(7)$ makes sense. Clearly $S(0)=1$, and our real challenge is to show that $S'(x)=S(x)$. We can use the Cauchy product formula to multiply two series and conclude (via the binomial theorem) that $$S(a)S(b)=S(a+b)\tag{8}$$ for all values of $a, b$. Next we establish the fundamental limit $$\lim_{x\to 0}\frac{S(x)-1}{x}=1\tag{9}$$ For $0<|x|<1$ we have $$\left|\frac{S(x)-1}{x}-1\right|=\left|\sum_{n=2}^{\infty}\frac{x^{n-1}}{n!}\right|$$ and this does not exceed $$\frac{|x|}{2!}+\frac{|x|^2}{3!}+\dots$$ which in turn does not exceed the geometric series $$\frac{|x|}{2}+\frac{|x|^2}{4}+\frac{|x|^3}{8}+\dots=\frac{|x|}{2-|x|}$$ and this tends to $0$ with $x$, so $(9)$ is established.
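A numerical check of $(9)$, purely illustrative, computing $S$ from partial sums of $(7)$ alone so that nothing about $e^x$ is presupposed (the function name `S` and the term count are my own choices for the sketch):

```python
# Check that (S(x) - 1)/x -> 1 as x -> 0, using partial sums of (7) only.
def S(x, n_terms=30):
    total, term = 0.0, 1.0
    for k in range(n_terms):
        total += term
        term *= x / (k + 1)
    return total

for x in (0.5, 0.1, 0.01, 0.001):
    value = (S(x) - 1.0) / x
    print(x, value, abs(value - 1.0))
# The deviation from 1 shrinks roughly like |x|/2, within the bound |x|/(2-|x|).
```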

We now have $$S'(x) =\lim_{h\to 0}\frac{S(x+h)-S(x)}{h}=\lim_{h\to 0}\frac {S(x) S(h) - S(x)} {h} \\ =\lim_{h\to 0}S(x)\cdot\frac{S(h)-1}{h}=S(x)$$ using equations $(8),(9)$. Thus we have proved that $S(x) $ is the unique solution to $(6)$ and by definition of $e^x$ it equals $e^x$.

Solution 4:

It can be proved using the Lagrange remainder, which represents the difference between the function and its Taylor polynomial $P_n$:

$$R_n(x)=f(x)-P_n(x)=\frac{f^{(n+1)}(\xi)}{(n+1)!}(x-a)^{n+1}\quad\text{for some }\xi\text{ between }a\text{ and }x.$$ For $f(x)=e^x$ and $a=0$ this gives $R_n(x)=\frac{e^{\xi}x^{n+1}}{(n+1)!}$, which tends to $0$ as $n\to\infty$ for every fixed $x$.

See here for a related OP.