How does one prove L'Hôpital's rule?

L'Hôpital's rule can be stated as follows:

Let $f, g$ be differentiable real functions defined on a deleted one-sided neighbourhood$^{(1)}$ of $a$, where $a$ can be any real number or $\pm \infty$, with $g'(x) \neq 0$ throughout that neighbourhood (this is needed for the quotient $f'/g'$ to make sense). Suppose that both $f,g$ converge to $0$ or that both $f,g$ converge to $+\infty$ as $x \to a^{\pm}$ ($\pm$ depending on the side of the deleted neighbourhood). If $$\frac{f'(x)}{g'(x)} \to L,$$ then $$\frac{f(x)}{g(x)} \to L,$$ where $L$ can be any real number or $\pm \infty$.

This is a ubiquitous tool for computing limits, and some books avoid proving it or prove it only in special cases. Since we don't seem to have a consistent reference for its statement and proof on MathSE, and since it is a theorem which is often misapplied (see here for an example), it seems valuable to have a question which could serve as such a reference. This is an attempt at that.

$^{(1)}$E.g., if $a=1$, then $(1,3)$ is such a neighbourhood.
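Before the proofs, a numerical sanity check of the statement may be helpful (this is only an illustration, not part of any proof); here we take the standard $0/0$ example $f(x)=\sin x$, $g(x)=x$, $a=0^+$, where $L=1$:

```python
import math

# 0/0 example: f(x) = sin x, g(x) = x, both tend to 0 as x -> 0+
f = math.sin
g = lambda x: x
fp = math.cos          # f'(x) = cos x
gp = lambda x: 1.0     # g'(x) = 1

for x in [1e-1, 1e-3, 1e-6]:
    # both quotients approach the same limit L = 1, as the rule predicts
    print(x, f(x) / g(x), fp(x) / gp(x))
```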


Solution 1:

For the sake of avoiding clutter, we suppose without loss of generality that $x \to a^+$ and adopt the convention that a "neighbourhood" means a deleted one-sided neighbourhood.

Pick $\epsilon>0$. By hypothesis, there exists a neighbourhood $U$ of $a$ such that $g'(x) \neq 0$ for every $x \in U$ and $$L-\epsilon<\frac{f'(x)}{g'(x)}<L+\epsilon \tag{1}$$ for every $x \in U$. By the Cauchy mean value theorem, it follows that for every $\alpha, \beta \in U$ with $\beta>\alpha$, $$L-\epsilon<\frac{f(\beta)-f(\alpha)}{g(\beta)-g(\alpha)}<L+\epsilon. \tag{2}$$ (Note that $g(\beta)-g(\alpha) \neq 0$: otherwise, Rolle's theorem would yield a zero of $g'$ in $(\alpha,\beta) \subseteq U$.)
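As an aside, the Cauchy mean value theorem used here can be checked numerically; a small sketch with the sample functions $f(x)=x^2$, $g(x)=x^3$ on $[1,2]$ (bisection merely locates the intermediate point; nothing here is part of the proof):

```python
# Cauchy MVT: there exists c in (alpha, beta) with
#   (f(beta) - f(alpha)) / (g(beta) - g(alpha)) = f'(c) / g'(c).
# Example: f(x) = x^2, g(x) = x^3 on [1, 2]; the quotient is 3/7 and
# f'(c)/g'(c) = 2/(3c), so the intermediate point is c = 14/9.
f, fp = lambda x: x**2, lambda x: 2 * x
g, gp = lambda x: x**3, lambda x: 3 * x**2

alpha, beta = 1.0, 2.0
q = (f(beta) - f(alpha)) / (g(beta) - g(alpha))  # = 3/7

# locate c by bisection on f'(c)/g'(c) - q (decreasing in c here)
lo, hi = alpha, beta
for _ in range(60):
    mid = (lo + hi) / 2
    if fp(mid) / gp(mid) - q > 0:
        lo = mid
    else:
        hi = mid

print(lo)  # close to 14/9 = 1.5555...
```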

We now split the proof into our two main cases.

Case 1: $f,g \to 0$ as $x \to a^+$.

By letting $\alpha \to a^+$ in $(2)$, we have that $$L-\epsilon \leq \frac{f(\beta)}{g(\beta)}\leq L+\epsilon.$$ Since this holds for every $\beta \in U$ and $\epsilon>0$ was arbitrary, we have the result in this case.

Remark: Note that $g(\beta)\neq 0$, by the mean value theorem applied to $g$ extended continuously to $a$ (with $g(a)=0$). In fact, one could also apply the Cauchy mean value theorem to the extensions of $f$ and $g$ in order to prove this case, by considering $\frac{f(\beta)-f(a)}{g(\beta)-g(a)} = \frac{f(\beta)}{g(\beta)}$ directly. Of course, this would not work in the case where $f,g \to +\infty$.

Case 2: $f,g \to +\infty$ as $x \to a^+$.

Fixing $\beta$ and taking $\alpha \in U$ with $\alpha<\beta$ close enough to $a$ that $g(\alpha)>\max\{0,g(\beta)\}$ (possible since $g \to +\infty$), we can rewrite $(2)$ as $$L-\epsilon<\frac{f(\beta)}{g(\beta)-g(\alpha)} + \frac{g(\alpha)}{g(\alpha)-g(\beta)} \cdot\frac{f(\alpha)}{g(\alpha)}<L+\epsilon.$$ Taking the $\limsup$ and $\liminf$ as $\alpha \to a^+$, together with the fact that $g \to +\infty$ (so that $\frac{f(\beta)}{g(\beta)-g(\alpha)} \to 0$ and $\frac{g(\alpha)}{g(\alpha)-g(\beta)} \to 1$), yields $$L-\epsilon \leq \liminf_{\alpha \to a^+}\frac{f(\alpha)}{g(\alpha)} \leq \limsup_{\alpha \to a^+}\frac{f(\alpha)}{g(\alpha)} \leq L+\epsilon.$$ Since this holds for every $\epsilon>0$, we have that $\liminf_{\alpha \to a^+}\frac{f(\alpha)}{g(\alpha)} = \limsup_{\alpha \to a^+}\frac{f(\alpha)}{g(\alpha)}=L$, and the result follows.
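The algebraic rewriting above can be double-checked numerically; a minimal sketch, with arbitrary sample values standing in for $f(\alpha), f(\beta), g(\alpha), g(\beta)$ (nothing here is specific to the proof):

```python
# Verify the identity used to rewrite (2):
#   (f(b) - f(a)) / (g(b) - g(a))
#     = f(b)/(g(b) - g(a)) + (g(a)/(g(a) - g(b))) * (f(a)/g(a))
# Any sample values with g(a) != g(b) and g(a) != 0 will do.
def check(fa, fb, ga, gb):
    lhs = (fb - fa) / (gb - ga)
    rhs = fb / (gb - ga) + (ga / (ga - gb)) * (fa / ga)
    return abs(lhs - rhs) < 1e-12

print(check(fa=100.0, fb=3.0, ga=200.0, gb=5.0))  # True
```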


Some observations:

  • It should also be clear that if $L = +\infty$ (resp. $-\infty$), then these proofs can be easily adapted by changing "pick $\epsilon>0$" to "pick $K \in \mathbb{R}$" and changing inequality $(1)$ to $\frac{f'(x)}{g'(x)} >K$ (resp. $\frac{f'(x)}{g'(x)} < K$), while also making the corresponding obvious changes in what follows.
  • As a mild curiosity (which is not so deep after some inspection), note that in the case of $f,g \to +\infty$, the assumption that $f \to +\infty$ is actually not necessary: it suffices to assume that $g \to +\infty$. But stating the theorem without assuming $f \to +\infty$ may be confusing to students, who see it much more frequently in the context of the so-called "indeterminate forms".
  • The passage involving the $\limsup$ and $\liminf$ may be somewhat obscure. First of all, we are adopting the following definitions: $$\limsup_{x \to a} f(x) = \inf_{\substack{\text{$U$ del.} \\ \text{nbhd. of $a$}}} \sup_{x \in U} f(x), \quad \liminf_{x \to a} f(x) = \sup_{\substack{\text{$U$ del.} \\ \text{nbhd. of $a$}}} \inf_{x \in U} f(x).$$ We could also handle that part sequentially by taking $x_n \to a$ and using the $\limsup$ and $\liminf$ of sequences, establishing that the limit is the same for every sequence $x_n \to a$. It is a matter of preference.

    Then, we are using the following facts:

    1) If $\lim_{x \to a} h(x) = M$, then $$\limsup_{x \to a} (h(x)+j(x)) = M +\limsup_{x \to a} j(x)$$ and $$\liminf_{x \to a} (h(x)+j(x)) = M +\liminf_{x \to a} j(x).$$

    2) If $\lim_{x \to a} h(x) = c >0$, then $$\limsup_{x \to a} (h(x) j(x)) = c \cdot\limsup_{x \to a} j(x)$$ and $$\liminf_{x \to a} (h(x) j(x)) = c \cdot\liminf_{x \to a} j(x).$$

    3) If $h(x) \leq j(x)$, then $$\limsup_{x \to a} h(x) \leq \limsup_{x \to a} j(x)$$ and $$\liminf_{x \to a} h(x) \leq \liminf_{x \to a} j(x).$$

    4) $\liminf_{x \to a} h(x) \leq \limsup_{x \to a} h(x)$ and, if both coincide, then $\lim_{x \to a} h(x)$ exists and equals both.
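For sequences, facts 1) and 4) can be illustrated with a short numerical sketch (a supremum over a long but finite tail stands in for the true $\limsup$, so this is only an approximation):

```python
# Illustrate fact 1) for sequences: if h_n -> M, then
#   limsup (h_n + j_n) = M + limsup j_n.
# Here h_n = 2 + 1/n -> M = 2 and j_n = (-1)^n, so limsup j_n = 1.
N = 10_000          # tail length used to approximate the limsup
h = lambda n: 2 + 1 / n
j = lambda n: (-1) ** n

tail = range(N, 2 * N)
limsup_j = max(j(n) for n in tail)            # approx. limsup j_n = 1
limsup_sum = max(h(n) + j(n) for n in tail)   # approx. limsup (h_n + j_n)

print(limsup_j, limsup_sum)  # close to 1 and 3 = 2 + 1
```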

Solution 2:

By definition, $f'(a) = \lim\limits_{x\to a} \frac {f(x) - f(a)}{x-a}$.

If $f'(a)$ and $g'(a)$ exist and $g'(a) \ne 0$, then (noting that $g(x) \neq g(a)$ for $x \neq a$ close to $a$, since $g'(a) \neq 0$)

$\frac {f'(a)}{g'(a)} = \lim\limits_{x\to a} \dfrac {\frac {f(x) - f(a)}{x-a}}{\frac {g(x) - g(a)}{x-a}} = \lim\limits_{x\to a}\frac {f(x) - f(a)}{g(x) - g(a)}.$

If, moreover, $f(a) = g(a) = 0$, then

$\frac {f'(a)}{g'(a)} = \lim\limits_{x\to a}\frac {f(x)}{g(x)}.$

Note that this argument only proves the rule in the special case where $f$ and $g$ are differentiable at $a$ itself with $g'(a) \neq 0$ and $f(a)=g(a)=0$; it does not cover the $\infty/\infty$ case, limits at $\pm\infty$, or the case where $f'/g'$ merely converges as $x \to a$.
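A quick numerical check of this special case, with the sample functions $f(x)=e^x-1$, $g(x)=\sin x$, $a=0$, where $f(0)=g(0)=0$ and $f'(0)/g'(0)=1$:

```python
import math

# f(x) = e^x - 1, g(x) = sin x, a = 0: f(0) = g(0) = 0 and
# f'(0) = g'(0) = 1, so the rule predicts lim f/g = 1 as x -> 0
f = lambda x: math.exp(x) - 1
g = math.sin

for x in [1e-1, 1e-3, 1e-6]:
    print(x, f(x) / g(x))  # approaches f'(0)/g'(0) = 1
```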