How to check the real analyticity of a function?
I recently learnt Taylor series in my class. I would like to know how it is possible to distinguish whether a function is real analytic or not. The first thing to check is whether it is smooth. But how can I know whether the Taylor series converges to the function?
For example: $f(x)=\frac{1}{1-x}$, $x\in(0,1)$, has $n$th-degree Taylor polynomial $\sum_{k=0}^n x^k$. In this case, I understand that $f$ is analytic on its domain, since the geometric series $\sum_{k=0}^\infty x^k$ converges to $\frac{1}{1-x}$ for $x\in(0,1)$.
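(Indeed, here the remainder can be computed in closed form, which is what convinces me:
$$\frac{1}{1-x}-\sum_{k=0}^{n}x^k=\frac{x^{n+1}}{1-x}\longrightarrow 0 \quad\text{as } n\to\infty, \text{ for each fixed } x\in(0,1),$$
so the partial sums converge to $f$ on $(0,1)$.)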
In general, what is the trick? For example, how does one know whether $\sin(x)$ and $\cos(x)$ are analytic?
Solution 1:
This is a difficult question in general. Ideally, to show that $f$ is analytic at the origin, you show that in a suitable neighborhood of $0$, the error of the $n$-th Taylor polynomial approaches $0$ as $n\to\infty$.
For example, for $f(x)=\sin(x)$, any derivative of $f(x)$ is one of $\sin(x)$, $\cos(x)$, $-\sin(x)$, or $-\cos(x)$, and the error given by the $n$-th Taylor polynomial takes the form $\displaystyle \frac{f^{(n+1)}(\alpha)}{(n+1)!}x^{n+1}$ for some $\alpha$ between $0$ and $x$ (that depends on $n$). In absolute value, this is bounded by $\displaystyle \frac{|x|^{n+1}}{(n+1)!}$, which (on any bounded set) approaches $0$ uniformly as $n\to\infty$. This shows that the Taylor series for $f(x)=\sin(x)$ converges to $\sin(x)$ in any neighborhood of $0$ (and therefore everywhere). The same applies to $f(x)=\cos(x)$. A similar argument holds for a variety of functions, including $f(x)=e^x$.
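To spell out why that bound goes to $0$: $\displaystyle \frac{|x|^{n+1}}{(n+1)!}$ is the general term of the everywhere-convergent series for $e^{|x|}$, so it tends to $0$ for each fixed $x$; and on a bounded set $|x|\le R$ we have
$$\frac{|x|^{n+1}}{(n+1)!}\le\frac{R^{n+1}}{(n+1)!}\longrightarrow 0,$$
which gives the uniformity.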
And there are general theorems; for instance, any solution of a linear homogeneous ordinary differential equation with analytic coefficients is analytic (in a small neighborhood), as the differential equation can be used to establish bounds on the error term. The case of sine is an example, as $\sin(x)$ is a solution of $y''=-y$.
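To illustrate with this example: substituting a power series $y=\sum_{n\ge 0} a_n x^n$ into $y''=-y$ and comparing coefficients gives the recursion
$$(n+1)(n+2)\,a_{n+2}=-a_n,$$
so by induction $|a_n|\le \max(|a_0|,|a_1|)/n!$ and the series has infinite radius of convergence; taking $a_0=0$, $a_1=1$ recovers $\displaystyle \sin(x)=\sum_{k\ge 0}(-1)^k\frac{x^{2k+1}}{(2k+1)!}$.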
But the question is difficult in general. For example, a uniformly convergent series of analytic functions need not be analytic. For instance, consider the Weierstrass function, which is in fact nowhere differentiable.
Given a smooth function $f$ and a point $a$ in its domain, it may be that the formal Taylor series associated to $f$ at $a$ does not converge anywhere. Clearly in that case $f$ is not analytic at $a$. But it may also be that the formal Taylor series associated to $f$ at $a$ converges in an interval, but does not converge to $f$ (identically) in any such interval. Then, again, $f$ is not analytic, but this may be harder to establish. For a short survey of $C^\infty$ nowhere analytic functions, by Dave L Renfro, see here.
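The standard example of the second phenomenon is
$$f(x)=\begin{cases} e^{-1/x^2} & x\neq 0,\\ 0 & x=0,\end{cases}$$
which is $C^\infty$ with $f^{(n)}(0)=0$ for every $n$: its Taylor series at $0$ is identically $0$, so it converges everywhere, but not to $f$ on any interval around $0$.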
In practice, for many analytic functions $f$, analyticity is established not by studying the rate of decay of the error terms, but by "inheritance". For example, $f$ could be the term-by-term derivative of an analytic function, or its term-by-term antiderivative, or the result of composing two analytic functions, etc.
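A typical instance: antidifferentiating the geometric series $\frac{1}{1+t^2}=\sum_{k\ge 0}(-1)^k t^{2k}$ term by term gives
$$\arctan(x)=\sum_{k=0}^\infty(-1)^k\frac{x^{2k+1}}{2k+1},\qquad |x|<1,$$
so $\arctan$ is analytic on $(-1,1)$ without any direct estimate on its derivatives.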
Solution 2:
I agree with Betty Mock's thesis that complex analytic functions are usually easier to deal with than real analytic functions, but I don't think that entire functions (e.g. $e^z$, $\cos z$, $\sin z$) are a good example of that: the real-analytic story is the same as the complex-analytic story.
Looking around, I found this previous math.SE question, which asks why, if a real function $f$ is analytic at $a \in \mathbb{R}$ and $f(a) \neq 0$, then $\frac{1}{f}$ is analytic at $a$. As the answers indicate, if you use complex analysis then it's just a matter of adapting the usual argument that the reciprocal of a differentiable function is differentiable to functions of a complex variable (no problem). However, if you insist on showing directly that the Taylor series of $\frac{1}{f}$ has a positive radius of convergence at $a$... then this really is a pain, as several of the answers (including one due to me, where I reference a book on real analytic function theory but don't have the energy to reproduce the details) attest.
Even understanding why $f(x) = \frac{1}{x^2+1}$ is real analytic, and why the radius of convergence of its Taylor series at $a \in \mathbb{R}$ is precisely $\sqrt{1+a^2}$, takes only a quick pure-thought argument if you know the rudiments of complex analysis. But trying to establish this formula for the radius of convergence directly from the Taylor series expansion... there will be some actual work there, it seems to me.
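For the record, the pure-thought argument: $\frac{1}{1+z^2}$ has poles exactly at $z=\pm i$, and the radius of convergence of the Taylor series of a complex analytic function at a point equals the distance to the nearest singularity, so at a real point $a$ it is
$$R=|a-(\pm i)|=\sqrt{a^2+1}.$$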
Solution 3:
Figuring out whether a function is real analytic is a pain; figuring out whether a complex function is analytic is much easier. First, understand that a real function can be analytic on an interval, but not on the entire real line.
So what I try to do is consider $f$ as a function of a complex variable in a neighborhood of the point in question, say $x = a$. If $f(z)$ (with $z = x + iy$ complex) agrees with $f(x)$ when $y = 0$, then $f(z)$ is an extension of $f$ to a neighborhood of $a$. To show that $f(z)$ is analytic, you need only show that it has a complex derivative throughout some neighborhood of $a$. If so, then its Taylor series will converge to $f(z)$ in some neighborhood of $a$, and $f$ as a real function is analytic on the interval that this neighborhood covers.
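For example, $\sin x$ extends to
$$\sin z=\frac{e^{iz}-e^{-iz}}{2i},$$
which is built from entire functions and hence complex differentiable on all of $\mathbb{C}$, and which reduces to $\sin x$ when $z=x$ is real.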
This approach does show that $\sin$ and $\cos$ are analytic in the entire plane, and thus on the $x$-axis.