Why isn't $\lim \limits_{x\to\infty}\left(1+\frac{1}{x}\right)^{x}$ equal to $1$?

Given $\lim \limits_{x\to\infty}(1+\frac{1}{x})^{x}$, why can't you reduce it to $\lim \limits_{x\to\infty}(1+0)^{x}$, making the result "$1$"? Obviously, it's wrong, as the true value is $e$. Is it because the $\frac{1}{x}$ is still something even though it's really small? Then why is $$\lim_{x\to\infty}\left(\frac{1}{x}\right) = 0\text{?}$$

What is the proper way of calculating the limit in this case?


Solution 1:

Note that your argument, if valid, would work with multiplication instead of exponentiation: because $\lim_{x\to\infty}\frac1x=0$, one would get $\lim_{x\to\infty}\left(\frac1x\right)x=\lim_{x\to\infty}0\cdot x=0$. This is of course very wrong, since $\left(\frac1x\right)x=1$ for every $x>0$ and hence the limit is $1$, but it uses exactly the same reasoning as yours. Maybe this will help you find your error.
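
If you want to see the analogy numerically, here is a minimal Python sketch (illustrative only, nothing beyond built-ins assumed):

```python
# Apply the same "replace 1/x by its limit 0" step to a product instead:
# (1/x) * x is identically 1, so its limit is 1, not 0.
for x in [10, 1_000, 100_000]:
    print(x, (1 / x) * x)  # prints 1.0 each time (up to floating point)
```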

Solution 2:

Let $f(x,y)=(1+y)^x$. True enough, $f(x,0)=1$ for every $x$, but this is irrelevant to the limit of $f(x,1/x)$ when $x\to+\infty$. Note that one could just as well consider $f(\infty,1/x)=\infty$ for every positive $x$, which is as irrelevant as the preceding value $1$.

To compute the actual limit of $f(x,1/x)$, several approaches exist. One is to look at $\log f(x,1/x)=x\log(1+1/x)$ and to remember that $\log(1+u)\sim u$ when $u\to0$, hence $x\log(1+1/x)\sim x\cdot\frac1x=1$, that is, $\log f(x,1/x)\to1$ and $f(x,1/x)\to\mathrm e$.
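
A quick numerical check of this computation (a sketch in Python; the variable names are mine):

```python
import math

# log f(x, 1/x) = x*log(1 + 1/x) should tend to 1,
# hence (1 + 1/x)**x should tend to e ~ 2.71828.
for x in [10, 1_000, 100_000, 10_000_000]:
    log_f = x * math.log(1 + 1 / x)
    print(f"x = {x:>10}: x*log(1+1/x) = {log_f:.8f}, (1+1/x)^x = {math.exp(log_f):.8f}")
```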

To see why $\log(1+u)\sim u$ when $u\to0$, consider $g(u)=\log(1+u)$ and note that $g(0)=0$ while $g'(u)=1/(1+u)$ hence $g'(0)=1$ and the Taylor expansion $g(u)=g(0)+g'(0)u+o(u)$ yields the result.
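
Numerically, the equivalence $\log(1+u)\sim u$ shows up as the ratio $\log(1+u)/u$ tending to $1$; a small sketch (using the standard library only):

```python
import math

# log(1+u) ~ u means that log(1+u)/u -> 1 as u -> 0.
for u in [0.1, 0.01, 0.001, 0.0001]:
    print(f"u = {u:<8} log(1+u)/u = {math.log1p(u) / u:.6f}")
```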

Finally, note that, for every fixed $c$, $f(x,c/x)=(1+c/x)^x\to\mathrm e^c$, hence one can realize every positive limit $\mathrm e^c$ by considering the regimes $x\to+\infty$, $xy\to c$. The limit $1$ is realized when $x\to+\infty$ while $xy\to0$, and the limit $+\infty$ when $x\to+\infty$ while $xy\to+\infty$.
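
To illustrate the different regimes numerically (illustrative values only; the choice of $x$ and $c$ is mine):

```python
import math

x = 1_000_000
for c in [0.0, 0.5, 1.0, 2.0]:
    # (1 + c/x)**x should be close to e**c once x is large
    print(f"c = {c}: (1 + c/x)^x = {(1 + c / x) ** x:.6f}   e^c = {math.exp(c):.6f}")
```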

Solution 3:

To give a shorter version of what other people have said, what you have actually computed is the iterated limit $$\lim_{x\to\infty}\lim_{y\to\infty} \left(1+\frac 1y\right)^x = 1.$$ In other words, you split your limit into two separate limits, one for each occurrence of $x$. While this works often enough that people tend to believe it always does, it in fact fails for many limits, as this example demonstrates!
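
The difference between the iterated limit and the joint limit is easy to see numerically; a small Python sketch (the specific values are mine):

```python
# Iterated limit: fix x, send y to infinity first; the inner expression tends to 1,
# so taking x to infinity afterwards still gives 1.
x, y = 50, 10**12
print((1 + 1 / y) ** x)  # essentially 1.0 for any fixed x once y is huge

# Joint limit along y = x: both grow together and the value tends to e ~ 2.71828.
for n in [10, 1_000, 100_000]:
    print(n, (1 + 1 / n) ** n)
```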

Solution 4:

In this answer, it is shown that $$ a_k=\left(1+\frac1k\right)^k $$ is an increasing sequence and $$ b_k=\left(1+\frac1k\right)^{k+1} $$ is a decreasing sequence. Thus, for any $k$, $$ a_k\le\lim_{n\to\infty}\left(1+\frac1n\right)^n\le b_k. $$ Since $a_1=2$, we have that $$ \lim_{n\to\infty}\left(1+\frac1n\right)^n\ge2. $$
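
A numerical check of the monotonicity and of the resulting bounds (a sketch; the names `a` and `b` follow the sequences above):

```python
def a(k):
    return (1 + 1 / k) ** k        # increasing in k

def b(k):
    return (1 + 1 / k) ** (k + 1)  # decreasing in k

for k in [1, 2, 5, 10, 100, 10_000]:
    print(f"k = {k:>6}: a_k = {a(k):.6f} <= e <= b_k = {b(k):.6f}")
```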


In the erroneous argument, the two $n$s are decoupled; the $n$ in the denominator is sent to $\infty$ first, before the $n$ in the exponent. In the limit, they both go to $\infty$ together. The decrease caused by the $n$ in the denominator is more than cancelled by the increase caused by the $n$ in the exponent (as I noted above, $a_n$ is an increasing sequence).