Proving that the limit of $f(x) - T_nf(x)$ (Taylor remainder) is zero, in multivariable calculus

So as we all probably know, for $f : \mathbb{R} \to \mathbb{R}$ and $T_nf(x)$ the Taylor polynomial of degree $n$ at a given point $a$, we have $\lim\limits_{x \to a} \frac{f(x) - T_nf(x)}{(x-a)^n} = 0$.

Our professor proved this by induction on $n$, using the fact that $T_n^{(j)}f = T_{n-j}f^{(j)}$, where $(j)$ denotes the $j$-th derivative, and then using L'Hospital's rule to conclude the claim for $n+1$.
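For reference, the induction step goes roughly like this (my sketch of it, using the identity above with $j = 1$, i.e. $T_{n+1}^{(1)}f = T_nf'$): assuming the claim for $n$ and applying it to $f'$,

$\lim\limits_{x \to a} \frac{f(x) - T_{n+1}f(x)}{(x-a)^{n+1}} = \lim\limits_{x \to a} \frac{f'(x) - T_nf'(x)}{(n+1)(x-a)^n} = 0,$

where the first equality is L'Hospital's rule (both numerator and denominator vanish at $x = a$).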

After learning some multivariable and vector calculus, we started studying multivariable Taylor polynomials, which I understand, but we were given the following theorem without proof:

Let $U \subseteq \mathbb{R}^n$ be an open set, $a \in U$, and $f: U \to \mathbb{R}^m$ such that $f \in C^n(U)$, i.e. $f$ is $n$ times continuously differentiable on $U$. Then:

$\lim\limits_{x \to a} \frac{f(x) - T_nf(x)}{||x-a||^n} = 0$

or, equivalently (regarding $T_nf$ as a polynomial in the increment $h$):

$\lim\limits_{h \to 0} \frac{f(a+h) - T_nf(h)}{||h||^n} = 0$

So at first glance this looks simple: use induction, treating the denominator as a norm and the derivatives as differentials. But as far as I can gather, L'Hospital's rule doesn't exist for multivariable functions. How can I start my proof of this, without just reducing to each coordinate and pointing to the single-dimension case?


Henri Cartan's book Differential Calculus (which is out of print, but available under a different title) gives an extremely general version of Taylor's theorem (Theorem $5.6.3$) for maps between Banach spaces. It states the following (with modified notation and phrasing):

Taylor Expansion Theorem:

Let $V$ and $W$ be Banach spaces over the field $\Bbb{R}$, let $U$ be an open subset of $V$, and fix a point $a \in U$. Let $f:U \to W$ be a given function which is $n$ times differentiable at $a$ (in the Fréchet sense). Define the Taylor polynomial $T_{n,f}:V \to W$ by \begin{equation} T_{n,f}(h) = f(a) + \dfrac{df_a(h)}{1!} + \dfrac{d^2f_a(h)^2}{2!} + \dots + \dfrac{d^nf_a(h)^n}{n!} \end{equation} Then, $f(a+h) - T_{n,f}(h) = o(\lVert h \rVert^n)$.

Explicitly, the claim is that for every $\varepsilon > 0$, there is a $\delta > 0$ such that for all $h \in V$, if $\lVert h \rVert < \delta$, then \begin{equation} \lVert f(a+h) - T_{n,f}(h) \rVert \leq \varepsilon \lVert h \rVert^{n}. \end{equation}

Before proving this, there are some details which you should take note of. In the Taylor polynomial above, each $d^pf_a$ is a symmetric multilinear map from $V^p$ into $W$, and $(h)^p$ is short-hand for the element $(h,\dots,h) \in V^p$.
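To make the notation concrete in the finite-dimensional case (this specialization is mine, not the book's): for $V = \Bbb{R}^n$ and $W = \Bbb{R}$, we have $df_a(h) = \nabla f(a) \cdot h$ and $d^2f_a(h)^2 = h^{\top} Hf(a)\, h$, where $Hf(a)$ denotes the Hessian matrix, so the second-order Taylor polynomial unwinds to the familiar \begin{equation} T_{2,f}(h) = f(a) + \nabla f(a) \cdot h + \frac{1}{2}\, h^{\top} Hf(a)\, h. \end{equation}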

The proof of the theorem is pretty similar to the one-dimensional case; we use induction on $n$, and in the induction step, we're going to use the fact that \begin{align} d(T_{n+1,f})_h = T_{n,df}(h) \tag{$*$} \end{align} In words, this says the derivative of the function $T_{n+1,f}: V \to W$ at the point $h$ equals the $n^{th}$ Taylor polynomial for the function $df: U \to L(V,W)$ evaluated at $h$.

It is worth noting that while $T_{n+1,f}(h)$ is an element of $W$, in the equation above, $T_{n,df}(h)$ is an element of $L(V,W)$, i.e. it is a linear transformation from $V$ into $W$.
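As a sanity check (my own verification, not from the book), you can see $(*)$ directly for $n = 1$: here $T_{2,f}(h) = f(a) + df_a(h) + \frac{1}{2}d^2f_a(h)^2$, so for any $k \in V$, \begin{align} d(T_{2,f})_h(k) &= df_a(k) + \frac{1}{2}\left[ d^2f_a(h,k) + d^2f_a(k,h) \right] \\ &= df_a(k) + d^2f_a(h,k) \tag{by symmetry} \end{align} which is exactly $T_{1,df}(h)$ applied to $k$, since $T_{1,df}(h) = df_a + d(df)_a(h)$ and $d(df)_a(h)(k) = d^2f_a(h,k)$.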

Assuming you can justify this, we're going to use the mean-value inequality (Theorem $3.3.2$ in the book) to complete the proof; I'll state it here for completeness.

Mean-Value Inequality in Banach Spaces:

Let $V$ and $W$ be Banach spaces over the field $\Bbb{R}$, let $U$ be an open subset of $V$, and let $f:U \to W$ be a given differentiable function. If there is a convex subset $C$ contained in $U$ (for instance, a ball), and a constant $k > 0$, such that for every $x \in C$, $\lVert df_x \rVert \leq k,$ then for any $x_1, x_2 \in C$, we have that \begin{equation} \lVert f(x_1) - f(x_2) \rVert \leq k \lVert x_1 - x_2 \rVert. \end{equation}
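If you want some intuition for why this holds (this heuristic is mine; Cartan proves the inequality in full generality), suppose additionally that $df$ is continuous along the segment joining $x_1$ and $x_2$. Then, writing $\gamma(t) = x_2 + t(x_1 - x_2)$, the fundamental theorem of calculus gives \begin{equation} f(x_1) - f(x_2) = \int_0^1 df_{\gamma(t)}(x_1 - x_2) \, dt, \end{equation} and taking norms yields $\lVert f(x_1) - f(x_2) \rVert \leq \sup_{t \in [0,1]} \lVert df_{\gamma(t)} \rVert \cdot \lVert x_1 - x_2 \rVert \leq k \lVert x_1 - x_2 \rVert$.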


Now for the actual proof. The case $n = 1$ is true simply by definition of $f$ being differentiable at $a$. This completes the base case. We shall assume the statement is true for $n$, and prove it for $n+1$. Define the function $\phi$ by \begin{equation} \phi(h) = f(a+h) - T_{n+1,f}(h) \end{equation} Now, the differential of $\phi$ at $h$ is given by the formula \begin{align} d \phi_h &= df_{a+h} - d(T_{n+1,f})_h \\ &= df_{a+h} - T_{n,df}(h) \tag{by ($*$)} \end{align} (this is an equality of elements in $L(V,W)$).

Notice that $df: U \to L(V,W)$ is $n$-times differentiable at $a$, so we may apply our induction hypothesis to it. Doing so implies that \begin{equation} d\phi_h = df_{a+h} - T_{n,df}(h) = o(\lVert h \rVert^n) \end{equation} i.e. for every $\varepsilon > 0$, there is a $\delta>0$ such that if $\lVert h\rVert< \delta$ then \begin{align} \lVert d\phi_h \rVert \leq \varepsilon \lVert h \rVert^n \end{align}

For $\lVert h \rVert < \delta$ (with $\delta$ small enough that this ball lies in the domain of $\phi$), every point $x$ on the segment from $0$ to $h$ satisfies $\lVert d\phi_x \rVert \leq \varepsilon \lVert x \rVert^n \leq \varepsilon \lVert h \rVert^n$, so the mean-value inequality, applied on this segment (a convex set) with $k = \varepsilon \lVert h \rVert^n$, implies that \begin{align} \lVert \phi(h) - \phi(0)\rVert \leq \varepsilon \lVert h \rVert^n \cdot \lVert h \rVert = \varepsilon \lVert h \rVert^{n+1} \end{align} Since $\phi(0) = 0$, we have shown that $\phi(h) = o(\lVert h\rVert^{n+1})$. This completes the inductive step for $n+1$. Hence, by induction, the theorem holds for every $n \in \Bbb{N}$.


If you choose $V = \Bbb{R}^n$ and $W = \Bbb{R}^m$, then we recover the special case you're interested in (note that assuming this from the beginning doesn't simplify any part of the proof). As you can see, the only things we really used were induction and the mean-value inequality. The rest of the proof is just about being comfortable with Fréchet derivatives, especially higher-order differentials, and knowing which space each object lives in, where something is being evaluated, etc.

The justification of $(*)$ is really just a straightforward computation, but you have to be comfortable with differentiation in Banach spaces. The book explains the process clearly, so if you get stuck, you should refer to it.
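To give you a head start, here is a sketch of that computation (my outline; the book does it carefully): the constant term $f(a)$ has zero differential, and for the degree-$p$ term, symmetry of $d^pf_a$ gives \begin{equation} d\left( h \mapsto \tfrac{1}{p!} \, d^pf_a(h)^p \right)_h = \tfrac{1}{(p-1)!} \, d^pf_a(h)^{p-1}, \end{equation} where $d^pf_a(h)^{p-1}$ denotes the linear map $k \mapsto d^pf_a(h, \dots, h, k)$. Summing over $p = 1, \dots, n+1$ and using $d^pf_a = d^{p-1}(df)_a$, the re-indexed sum is exactly $T_{n,df}(h)$.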

By the way, if you make the additional hypothesis that the $(n+1)^{th}$ differential $d^{n+1}f$ exists and is bounded in a neighbourhood of $a$, then we can prove this theorem by looking at the explicit formula for the remainder term (either the integral form or the Lagrange form). This is also covered in Henri Cartan's book, so I highly recommend you take a look at it!
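For reference, the integral form of the remainder I have in mind (under the stronger hypothesis that $f$ is $C^{n+1}$ near $a$) is \begin{equation} f(a+h) - T_{n,f}(h) = \int_0^1 \frac{(1-t)^n}{n!} \, d^{n+1}f_{a+th}(h)^{n+1} \, dt, \end{equation} from which the estimate follows immediately: bounding $\lVert d^{n+1}f_{a+th} \rVert$ by a constant $M$ on a small ball gives $\lVert f(a+h) - T_{n,f}(h) \rVert \leq \frac{M}{(n+1)!} \lVert h \rVert^{n+1}$, which is certainly $o(\lVert h \rVert^n)$.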