Taylor's theorem in Banach spaces

Let $f$ be a real function of a single real variable. Suppose that $f$ is $n$ times differentiable at some $x$, for some integer $n\geq 1$. Making no further assumptions, we have

$$ f(x+h) = f(x) + f'(x)h + \frac{f''(x)h^2}{2} + \ldots + \frac{ f^{(n)}(x)h^n}{n!} + o(h^n) $$

although this tells us nothing about the size of the error on any fixed neighbourhood of $x$ (further regularity assumptions are needed to get the traditional forms of the remainder). Wikipedia has a proof of this which relies on, of all things, L'Hôpital's rule, for which a proof is also provided.
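For a concrete instance of what the statement does and does not say, take $f(x)=e^x$ at $x=0$ with $n=2$: it asserts only that

$$ e^h = 1 + h + \frac{h^2}{2} + o(h^2) \quad \text{as } h \to 0, $$

with no bound on the error for any fixed $h$.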

Recently I've been working through the basic theorems of calculus in the Banach space setting. It is not terribly surprising, and I have seen it stated in a few places, that the expansion

$$ f(x+h) = f(x) + f'(x)h + \frac{f''(x)h^2}{2} + \ldots + \frac{ f^{(n)}(x)h^n}{n!} + o(\|h\|^n) $$

is still valid when $f$ is a map of Banach spaces $X \to Y$ that is $n$ times differentiable at $x$. In this setting, the $k$th derivative $f^{(k)}(x)$ of $f$ at $x$ is a continuous, symmetric, $k$-linear map $X^k \to Y$, and $h^k$ is shorthand for $(h,\ldots,h) \in X^k$. I would like to know:

How do I prove this last expansion is valid?
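(To fix the notation in a familiar special case: if $X = \mathbb{R}^m$ and $Y = \mathbb{R}$, the quadratic term is the usual Hessian form,

$$ f''(x)h^2 = \sum_{i,j=1}^{m} \frac{\partial^2 f}{\partial x_i \partial x_j}(x)\, h_i h_j = h^{\mathsf{T}} \big(\operatorname{Hess} f(x)\big)\, h, $$

so for $n=2$ the expansion above is just the standard second-order Taylor formula in several variables.)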

I'm having trouble adapting the proof from the single-variable case because I can't seem to prove an appropriate analogue of L'Hôpital's rule. The industry-standard proof of L'Hôpital's rule depends on Cauchy's mean value theorem, and I am not sure whether that admits a Banach space analogue either. In fact, I would also be curious to know:

Do Cauchy's mean value theorem or L'Hôpital's rule have natural generalizations in the context of maps between Banach spaces?
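(One thing I do notice is that even the equality form of the mean value theorem fails as soon as the target has dimension greater than one: for $f\colon [0,2\pi] \to \mathbb{R}^2$, $f(t) = (\cos t, \sin t)$,

$$ f(2\pi) - f(0) = 0 \quad \text{while} \quad f'(t) = (-\sin t, \cos t) \neq 0 \text{ for all } t, $$

so there is no $c$ with $f(2\pi) - f(0) = 2\pi f'(c)$. Presumably any Banach space analogue would have to take the form of an inequality.)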

Thanks in advance for any replies.


Solution 1:

The classical mean value theorems, including Cauchy's, fail when the range has dimension greater than one. The key result in multidimensional analysis is the Mean Value Inequality: if $u,v\in X$ and $f\colon X\to Y$ is differentiable on the open line segment $]u,v[$ and continuous on the closed segment $[u,v]$, then
$$\|f(v)-f(u)\|\le\sup_{0<t<1}\|f'(u+t(v-u))(v-u)\|.$$

Taylor's theorem is true for $n=1$ by the very definition of the derivative, and one proceeds by induction on $n$. Write $T$ for the Taylor polynomial of $f$ at $x$ of order $n+1$; the key observation is that its derivative $T'$ is the Taylor polynomial of $f'$ at $x$ of order $n$. Applying the Mean Value Inequality to $f-T$ on the segment from $x$ to $x+h$, and using $f(x)=T(x)$ together with the induction hypothesis applied to $f'$, we get
$$\frac{\|f(x+h)-T(x+h)\|}{\|h\|^{n+1}}\le\sup_{0<t<1}\frac{\|(f-T)'(x+th)(h)\|}{\|h\|^{n+1}}<\varepsilon$$
when $\|h\|$ is sufficiently small.
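To spell out the last estimate, in case it helps: since $T'$ is the order-$n$ Taylor polynomial of $f'$ at $x$, the induction hypothesis applied to $f'$ gives, for $\|h\|$ small enough and every $0<t<1$,
$$\|(f-T)'(x+th)\| \le \varepsilon\,\|th\|^{n} \le \varepsilon\,\|h\|^{n}$$
in operator norm, hence $\|(f-T)'(x+th)(h)\|\le\varepsilon\,\|h\|^{n+1}$, which is exactly the bound used in the supremum above.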