Why is the notion of analytic function so important?

Analytic functions have several nice properties, including but not limited to:

  1. They are $C^\infty$ functions.
  2. If, near $x_0$, we have $$f(x)=a_0+a_1(x-x_0)+a_2(x-x_0)^2+a_3(x-x_0)^3+\cdots,$$ then $$f'(x)=a_1+2a_2(x-x_0)+3a_3(x-x_0)^2+4a_4(x-x_0)^3+\cdots$$ and you can start all over again. That is, you can differentiate them as if they were polynomials.
  3. The fact that you can express them locally as sums of power series allows you to compute fast approximate values of the function (a small numerical sketch of this item and the previous one follows this list).
  4. When the domain is connected, the whole function $f$ becomes determined by its behaviour in a very small region. For instance, if $f\colon\mathbb{R}\longrightarrow\mathbb R$ is analytic and you know the sequence $\left(f\left(\frac1n\right)\right)_{n\in\mathbb N}$, then this knowledge completely determines the whole function, since the points $\frac1n$ accumulate at $0$ (the identity theorem).
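
Here is a minimal numerical sketch of items 2 and 3 (the function, helper names, and term counts are my own illustrative choices, not part of the original answer): partial sums of the exponential series give fast approximations using only arithmetic, and differentiating the series term by term behaves exactly like differentiating a polynomial.

```python
import math

def exp_partial_sum(x, n_terms=20):
    # Item 3: approximate e^x by the first n_terms of its Taylor series at 0,
    # using nothing but +, *, and /.
    total, term = 0.0, 1.0
    for k in range(n_terms):
        total += term
        term *= x / (k + 1)   # a_{k+1} x^{k+1} = (a_k x^k) * x / (k+1)
    return total

print(exp_partial_sum(1.0), math.e)   # both ~ 2.718281828...

# Item 2: differentiate the series term by term, as if it were a polynomial.
# If f(x) = sum a_k x^k, then f'(x) = sum (k+1) a_{k+1} x^k.
coeffs = [1 / math.factorial(k) for k in range(10)]            # series of e^x
deriv_coeffs = [(k + 1) * coeffs[k + 1] for k in range(9)]     # series of (e^x)'
# For e^x the differentiated series has the same coefficients, as expected.
print(all(abs(a - b) < 1e-15 for a, b in zip(coeffs, deriv_coeffs)))   # True
```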

A serious issue when dealing with functions is the ability to evaluate them. The basic tools we have at our disposal for function evaluation are the four arithmetic operations.

Hence polynomials (and, to a lesser extent, rational functions) are of utmost importance. Taylor expansion bridges functions to polynomials and to their generalization, power series. In addition, such functions enjoy numerous important properties, such as continuity, differentiability, and smoothness, and are amenable to analytic manipulation.


Excellent question! I'm glad you asked!

There are lots of reasons, but I would say the most fundamental are the following:

1. Because Taylor series approximate using ONLY basic arithmetic

I wish someone had told me this back in school. It's why we study polynomials and Taylor series.

The fundamental mathematical functions we really understand deeply are $+$, $-$, $\times$, $\div$... to me, it's fair to say the study of polynomials is really the study of "what can we do with basic arithmetic?"

So when you prove that a function can be approximated by a Taylor series, what you're really saying is that you can evaluate that function to a desired precision via basic arithmetic.

If this doesn't sound impressive, it's probably because someone else has already done the work for you so you don't have to. ;) To elaborate:

You probably type sin(sqrt(2)) into a calculator and take it for granted that it gives you back an answer (and notice it's an approximate one!) without ever knowing how it actually does this. Well, there isn't a magic sin or sqrt circuit in your calculator. Everything is done via a sequence of $+$, $-$, $\times$, $\div$ operations, because those are the only things it knows how to do.

So how does it know which exact sequence of basic arithmetic operations to use? Well, frequently, someone has used Taylor series to derive the steps needed to approximate the function you want (see e.g. Newton's method). You might not have to do this if all you're doing is punching things into a calculator, because someone else has already done it for you.
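
To make that concrete, here is a small sketch in Python (the helper names and iteration counts are my own, purely illustrative): Newton's method produces the square root and a truncated Taylor series produces the sine, so sin(sqrt(2)) really is computed with nothing but the four arithmetic operations.

```python
def sqrt_newton(a, iters=8):
    # Newton's method for sqrt(a): repeatedly replace x by (x + a/x) / 2.
    x = a if a > 1 else 1.0
    for _ in range(iters):
        x = (x + a / x) / 2
    return x

def sin_taylor(x, n_terms=12):
    # Taylor series of sin at 0: x - x^3/3! + x^5/5! - ...
    total, term = 0.0, x
    for k in range(n_terms):
        total += term
        term *= -x * x / ((2 * k + 2) * (2 * k + 3))   # next odd-power term
    return total

# sin(sqrt(2)) using only +, -, *, /:
print(sin_taylor(sqrt_newton(2.0)))   # ~ 0.98777, matching math.sin(math.sqrt(2))
```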

In other words: Taylor series are the basic building blocks of fundamental functions.

But that's not all. There's also another important aspect to this:

2. Taylor series allow function composition using ONLY basic arithmetic

To understand this part, consider that the Taylor series for $f(x) = g(h(x))$ is pretty easy to obtain: you just differentiate via the chain rule ($f'(x) = g'(h(x))\,h'(x)$, and so on for higher derivatives), and you have the Taylor series for $f$ in terms of the derivatives of $g$ and $h$, using ONLY basic arithmetic.

In other words, when $f$ is analytic and you've "solved" your problem for $g$ and $h$, you've "solved" it for $f$ too! (You can think of "solving" here to mean that we can evaluate something in terms of its individual building blocks that we already know how to evaluate.)
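
Here is a rough sketch of that composability (the helper names are my own; it composes truncated series by straightforward polynomial substitution rather than by chain-rule bookkeeping, and it assumes the inner series has no constant term so that composing around $0$ makes sense). The point is the same: the series of $g(h(x))$ comes out of the series of $g$ and $h$ using only basic arithmetic.

```python
import math

def multiply_truncated(a, b, order):
    # Product of two truncated power series (lists of coefficients), kept to 'order' terms.
    c = [0.0] * order
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            if i + j < order:
                c[i + j] += ai * bj
    return c

def compose_truncated(g, h, order):
    # Coefficients of g(h(x)) up to x^(order-1), assuming h[0] == 0.
    # Horner-style evaluation: g(h) = g0 + h*(g1 + h*(g2 + ...)).
    result = [0.0] * order
    for gk in reversed(g):
        result = multiply_truncated(result, h, order)
        result[0] += gk
    return result

N = 8
g = [1 / math.factorial(k) for k in range(N)]                     # exp(u) at u = 0
h = [0 if k % 2 == 0 else (-1) ** (k // 2) / math.factorial(k)    # sin(x) at x = 0
     for k in range(N)]
f = compose_truncated(g, h, N)   # truncated Taylor series of exp(sin(x)) at 0

x = 0.3
print(sum(c * x**k for k, c in enumerate(f)), math.exp(math.sin(x)))  # both ~ 1.3438
```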

If composability seems like a trivial thing, well, it is most definitely not!! There are lots of other approximations for which composition only makes your life harder! Fourier series are one example. If you try to compose them arbitrarily (say, $\sin e^x$) you'll quickly run into a brick wall.

So, in other words, Taylor series also provide a "glue" for these building blocks.

That's a pretty good deal!!


Being analytic, and especially being complex-analytic, is a really useful property to have, because

  1. It's very restrictive. Complex-analytic functions integrate to zero around closed contours, are constant if they are bounded and analytic throughout $\mathbb{C}$ (or if their absolute value has a local maximum inside a domain), preserve angles locally wherever the derivative is nonzero (they are conformal), and have isolated zeros. Analyticity is also preserved in uniform limits. (A quick numerical check of the contour-integral claim follows this list.)
  2. Most of the functions we obtain from basic algebraic operations, as well as the elementary transcendental functions (and, indeed, solutions to linear differential equations), are analytic at almost every point of their domain, so the surprising restrictiveness of being an analytic function does not stop the class of analytic functions from containing many interesting and useful examples. Proving something about analytic functions therefore tells you something about all of these functions.
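
The first of these claims is easy to check numerically. The sketch below is my own illustration (the midpoint-rule helper is an assumed name, not a library routine): an entire function integrates to essentially zero around the unit circle, while a function with a pole inside the contour does not.

```python
import cmath

def contour_integral(f, n=2000):
    # Midpoint-rule approximation of the integral of f around the unit circle.
    total = 0.0 + 0.0j
    for k in range(n):
        t0 = 2 * cmath.pi * k / n
        t1 = 2 * cmath.pi * (k + 1) / n
        z0, z1 = cmath.exp(1j * t0), cmath.exp(1j * t1)
        total += f((z0 + z1) / 2) * (z1 - z0)
    return total

print(abs(contour_integral(cmath.exp)))          # essentially 0 (Cauchy's theorem)
print(abs(contour_integral(lambda z: 1 / z)))    # ~ 6.2832 = |2*pi*i|: pole at 0 inside
```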

Being real-analytic is rather less exciting (in particular, there is no notion of conformality and its related phenomena). Most properties of real-analytic functions can be deduced by restricting local properties of complex-analytic ones anyway, since a real-analytic function extends locally to a complex-analytic one. So we still have isolation of zeros and various other properties, but nowhere near as many (and uniform limits of real-analytic functions need no longer be analytic).


We like functions that can be expressed by Taylor series, because they are really well behaved. Doing analysis with analytic functions is simply easier than with more general functions.

It might be interesting to consider the state of affairs two centuries ago: the following quote by Niels Henrik Abel is one of my favorites

... There are very few theorems in advanced analysis which have been demonstrated in a logically tenable manner. Everywhere one finds this miserable way of concluding from the special to the general, and it is extremely peculiar that such a procedure has led to so few of the so-called paradoxes. It is really interesting to seek the cause.

To my mind, it lies in the fact that in analysis, one is largely occupied with functions which can be expressed by powers. As soon as other functions enter — this, however, is not often the case — then it does not work any more and a number of connected, incorrect theorems arise from the false conclusions. ...

(as quoted in *Niels Henrik Abel: Mathematician Extraordinary*)

In short, mathematicians of the 18th and early 19th century proved things about analytic functions because:

  • those are the functions that come up when doing analysis
  • those are the functions they knew how to prove things about

And, in fact, it wasn't even really recognized that the analytic functions were special.