Multiplying and composing Taylor series

For product: Suppose that the Taylor series for $f(x)$ about $x=0$ is $a_0+a_1x+a_2x^2 +a_3x^3+\cdots$, and converges if $|x|<A_f$. Suppose also that the series for $g(x)$ is $b_0+b_1x+b_2x^2+b_3x^3 +\cdots$, and converges if $|x|<A_g$. Then the series for $f(x)g(x)$ is in principle not hard to compute, and converges at least when $|x|<\min(A_f,A_g)$.

Just do what comes naturally, and multiply the two series as if they were long polynomials. Explicitly, the series for $f(x)g(x)$ is $$c_0+c_1x+c_2x^2+c_3x^3+\cdots, \quad\text{where}\quad c_n=\sum_{i=0}^n a_i b_{n-i}.\qquad\qquad(\ast)$$ In some cases, we may be able to find a closed form for the $c_n$. The series $\sum c_nx^n$ is called the convolution of the two series $\sum a_nx^n$ and $\sum b_n x^n$. The convolution operation is of great importance in many areas of mathematics.
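
To make $(\ast)$ concrete, here is a minimal plain-Python sketch of the coefficient convolution (the function name and the choice of $f(x)=e^x$, $g(x)=\frac{1}{1-x}$ are my own, purely for illustration):

```python
from math import factorial

def convolve(a, b):
    """c_n = sum_{i=0}^{n} a_i * b_{n-i}, for n up to min(len(a), len(b)) - 1."""
    n = min(len(a), len(b))
    return [sum(a[i] * b[k - i] for i in range(k + 1)) for k in range(n)]

a = [1 / factorial(n) for n in range(6)]   # coefficients of e^x = sum x^n / n!
b = [1.0] * 6                              # coefficients of 1/(1-x) = sum x^n, |x| < 1
print(convolve(a, b))                      # c_n = sum_{i=0}^n 1/i!, the partial sums of e
```

In this example the $c_n$ do have a closed form: they are the partial sums $\sum_{i=0}^n \frac{1}{i!}$, which approach $e$.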

Often, for approximation purposes, we only want to find the first few terms of the power series expansion for $f(x)g(x)$. Then the computations are easy.
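
For instance, to get the first few terms of $e^x\cos x$, multiply the truncated series and discard the high-order terms. A small sympy sketch (assuming sympy is available; the example is my own choice):

```python
import sympy as sp

x = sp.symbols('x')
f = sp.series(sp.exp(x), x, 0, 5).removeO()    # 1 + x + x**2/2 + x**3/6 + x**4/24
g = sp.series(sp.cos(x), x, 0, 5).removeO()    # 1 - x**2/2 + x**4/24
product = sp.expand(f * g)
print(sum(product.coeff(x, k) * x**k for k in range(5)))   # 1 + x - x**3/3 - x**4/6
```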

We said that the series for $f(x)g(x)$ converges at least when $|x|<\min(A_f, A_g)$. But the convergence behaviour may be far better than that. For example let $f(x)=1-x$ and $g(x)=\frac{1}{1-x}$. The series for $f(x)$ is simply $1-x$, and converges everywhere. The series for $g(x)$ only converges when $|x|<1$. But the series for $f(x)g(x)$ is simply $1$, and converges everywhere.
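
Here is a throwaway coefficient check of that example (plain Python, not part of the argument itself):

```python
# Convolving the coefficients of f(x) = 1 - x with those of g(x) = 1/(1-x):
a = [1, -1, 0, 0, 0, 0]
b = [1, 1, 1, 1, 1, 1]
c = [sum(a[i] * b[k - i] for i in range(k + 1)) for k in range(6)]
print(c)   # [1, 0, 0, 0, 0, 0] -- the product series is just 1
```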

Only a minor modification of $(\ast)$ is needed when we know the Taylor expansions of $f(x)$ and $g(x)$ in powers of $x-a$ instead of powers of $x$: just replace $x$ everywhere by $x-a$.

For composition: Again, we do what comes naturally, and simply substitute. Well, as pointed out by Robert Israel, the process is not quite that simple. The series for $f(x)$ usually tells us the behaviour of $f(x)$ when $x$ is near $0$, and may not be valid if $x$ is some distance from $0$. So substitution will work when $g(x)$ is near $0$ for $x$ near $0$. In terms of the Taylor series for $g(x)$, this means that the constant term in the expansion of $g(x)$ should be $0$. Thus $g(x)=\arctan(x)$ is generally OK, but $g(x)=1+x^2$ is not. If we attempt to substitute $1+x^2$ for $u$ in the usual series for $\frac{1}{1-u}$, we definitely will not obtain the power series expansion for $\frac{-1}{x^2}$, since this function does not have a power series expansion about $x=0$.

Unfortunately, the substitution process, even when it is valid, can be tedious. However, since we only consider functions $g(x)$ whose series has $0$ constant term, the expansion of $(g(x))^k$ contains no powers of $x$ lower than $x^k$. So finding the first few terms of the power series expansion of $f(g(x))$ is quite easy.
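
Here is a minimal sketch of that truncated substitution in plain Python (the helper names, the truncation order, and the example $g(x)=x+x^2$ are my own choices): because $g$ has zero constant term, only the first $N$ coefficients of $f$ can affect the first $N$ coefficients of $f(g(x))$.

```python
def mul(a, b, N):
    """Multiply two truncated series (coefficient lists), keeping orders below N."""
    c = [0.0] * N
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            if i + j < N:
                c[i + j] += ai * bj
    return c

def compose(f, g, N):
    """First N coefficients of f(g(x)), assuming g has zero constant term."""
    assert g[0] == 0, "substitution needs a zero constant term in g"
    result = [0.0] * N
    power = [1.0] + [0.0] * (N - 1)            # g(x)**0 = 1
    for fk in f[:N]:                           # higher terms of f cannot contribute
        result = [r + fk * p for r, p in zip(result, power)]
        power = mul(power, g, N)               # next truncated power of g
    return result

# Example: f(u) = 1/(1-u), g(x) = x + x**2, so f(g(x)) = 1/(1 - x - x**2)
N = 8
f = [1.0] * N
g = [0.0, 1.0, 1.0] + [0.0] * (N - 3)
print(compose(f, g, N))   # [1, 1, 2, 3, 5, 8, 13, 21] -- the Fibonacci numbers
```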

There are some simple but important cases. For example, if $f(u)=\frac{1}{1-u}$, then the series expansion of $f(u)$ is $$1+u+u^2+u^3+\cdots.$$ We want the series expansion for $\frac{1}{1+x^2}$. So our function is $f(g(x))$, where $g(x)=-x^2$. Just substitute $-x^2$ every time that you see $u$. We get $$\frac{1}{1+x^2}=1-x^2+x^4-x^6+\cdots.$$ The series for $\arctan(x)$ is usually obtained in this way. Find the series for $\frac{1}{1+x^2}$ as we just did, and integrate term by term.
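
A sympy check of that route (assuming sympy is available; the six-term truncation is arbitrary):

```python
import sympy as sp

x, u = sp.symbols('x u')
geom = sum(u**k for k in range(6))          # truncated series for 1/(1-u)
inv = sp.expand(geom.subs(u, -x**2))        # substitute u = -x**2
print(inv)                                  # 1 - x**2 + x**4 - x**6 + x**8 - x**10
print(sp.integrate(inv, x))                 # x - x**3/3 + x**5/5 - x**7/7 + ...
print(sp.series(sp.atan(x), x, 0, 8))       # arctan's own expansion, for comparison
```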


You can check some theory on these two matters:

A. The Cauchy product rule:

Let $$A = \sum a_k \quad\text{and}\quad B = \sum b_k.$$ Then $$A \cdot B = C = \sum c_k, \quad\text{where}\quad c_k = \sum_{n = 0}^{k} a_n b_{k - n}.$$

(the last expression is a discrete convolution)

The theorem is valid for finite sums, and for series provided one series converges and the other converges absolutely (this is Mertens' theorem).
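
A finite numerical illustration (a plain-Python check with two absolutely convergent geometric series of my own choosing, not a proof):

```python
K = 40
a = [1 / 2**k for k in range(K)]   # sums to A = 2
b = [1 / 3**k for k in range(K)]   # sums to B = 3/2
c = [sum(a[n] * b[k - n] for n in range(k + 1)) for k in range(K)]
print(sum(a) * sum(b), sum(c))     # both are numerically close to A*B = 3
```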

B. For the second matter, composition, you should consider the properties of the Taylor series. Since there is a broad range of functions we can compose, you should always pay attention to continuity, among other issues. However, in the simplest cases, you can compose a Taylor series with another polynomial or with elementary functions. For example, you have that

$$e^{\sin x} = \sum \frac{\sin^k x}{k!}$$ or

$$\frac{1}{1 - \sin^2 x} - 1 = \tan^2 x = \sum_{k = 1}^{\infty} \sin^{2k} x,$$

both of which give a good approximation with only a few terms.
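
A sympy check of the $e^{\sin x}$ example (assuming sympy is available; the six-term truncation is my own choice):

```python
import sympy as sp

x = sp.symbols('x')
composed = sum(sp.sin(x)**k / sp.factorial(k) for k in range(6))   # truncated e^{sin x}
print(sp.series(composed, x, 0, 6))              # 1 + x + x**2/2 - x**4/8 - x**5/15 + O(x**6)
print(sp.series(sp.exp(sp.sin(x)), x, 0, 6))     # direct expansion agrees to this order
```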