A natural discovery of $e$ as $\lim_{n\to\infty}\left(1+\frac{1}{n}\right)^n$ - what is unrigorous here?

The gaps in the argument that need to be filled with rigor are the following.

  1. First, we need an airtight definition of what $a^x$ means, at least for a real number $a > 0$ and a real number $x$.
  2. Second, we need to rigorously prove that the limit $$\lim_{h \rightarrow 0} \frac{a^h - 1}{h}$$ exists.
  3. Third, we need to prove that the limit $$\lim_{n \rightarrow \infty} \left(1 + \frac{1}{n}\right)^n$$ exists.
  4. Finally, we need to insert the value of this limit, i.e. $e$, into the limit in (2) and show that the limit there equals $1$, so that $\frac{d}{dx} e^x = e^x$.

Note here that because we do not have a different expression for the values of the limits above, we have to prove that these limits exist as a fact in and of itself, without finding their values. You may have done an $\epsilon$-$\delta$ proof before, but every such proof requires that you already know the limit's value. That point, together with (1), is where the real "real analysis" content lies here.
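As a numerical illustration (not a proof, and no substitute for the existence arguments above), here is a short sketch suggesting that the limits in steps (2) and (3) do stabilize; the sample values of $n$, $h$, and the base $a = 2$ are arbitrary choices for this demonstration:

```python
import math

# Step (3): (1 + 1/n)^n appears to approach a limit, later named e.
approx_e = [(1 + 1/n) ** n for n in (10, 1000, 100000)]
print(approx_e)  # increasing toward ~2.71828

# Step (2): (a^h - 1)/h appears to approach a limit as h -> 0 (here a = 2).
a = 2.0
quotients = [(a ** h - 1) / h for h in (1e-2, 1e-4, 1e-6)]
print(quotients)  # settling near ~0.6931 (which will turn out to be ln 2)
```

Of course, seeing the values stabilize numerically is exactly what an $\epsilon$-$\delta$ or monotone-convergence argument must replace.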


I'm not sure this answers your question completely, but I want to point out that the suspect step seems to be the following:

$$\frac{a^{h_n}-1}{h_n}=1$$

which should be a limit as $n \to \infty$

$$\frac{a^{h_n}-1}{h_n}\to 1$$

or also

$$\frac{a^{h_n}-1}{h_n}= 1+o(1)$$

that is

$$a=\left(1+h_n+o(h_n)\right)^\frac1{h_n} \implies e=\lim_{n\to \infty} \left(1+\frac1n+o\left(\frac1{n}\right)\right)^\frac1{n}$$

which seems to be a more convincing way to obtain the result.
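To see why the $o(1/n)$ term is harmless in the final limit, here is a small numerical sketch; taking the $o(1/n)$ term to be $1/n^2$ is an assumption made purely for illustration:

```python
import math

# Compare (1 + 1/n)^n with (1 + 1/n + o(1/n))^n, using 1/n**2 as a
# concrete o(1/n) perturbation: both approach the same limit e.
for n in (10, 1000, 100000):
    plain = (1 + 1/n) ** n
    perturbed = (1 + 1/n + 1/n**2) ** n
    print(n, plain, perturbed)
```

This matches the displayed implication: the perturbation inside the parentheses vanishes faster than $1/n$, so it does not affect the limit.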


I believe another approach would be more rigorous, and that might be the reason why your teacher is implying yours is not. Here you assumed that the function has the form $a^x$.

Another approach is to use analytic functions, which are functions that are given almost everywhere by their Taylor series. If you assume that there exists a function $\exp(x)$ which is its own derivative, then it must necessarily be analytic, since for all $x \in \mathbb{R}$ its Taylor series converges.

From this it follows that: $$\exp(x; x_0) = \sum_{k=0}^{\infty} \exp^{(k)}(x_0)\frac{(x-x_0)^k}{k!} = \exp(x_0) \sum_{k=0}^{\infty} \frac{(x-x_0)^k}{k!}$$ Without loss of generality, we can choose $\exp(0) = 1$ and use $x_0=0$, and we end up with the following definition of $\exp(x)$: $$\exp(x) = \sum_{k=0}^{\infty} \frac{x^k}{k!} $$ $$e := \exp(1)$$

Then you can show that $$\exp(1) = \sum_{k=0}^{\infty} \frac{1}{k!} = \lim_{n\to\infty}\left(1 + h_n\right)^{\frac{1}{h_n}}$$ for every sequence $h_n \neq 0$ with $\lim_{n\to\infty} h_n = 0$.
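The agreement between the series definition and the limit can be checked numerically; using $h_n = 1/n$ as the concrete null sequence is an assumption made here for illustration:

```python
import math

# Series definition of exp(1): partial sums converge very quickly.
series = sum(1 / math.factorial(k) for k in range(20))
print(series)  # ~2.718281828...

# Limit form (1 + h_n)^(1/h_n) with h_n = 1/n: converges to the same value.
for n in (10, 1000, 100000):
    h = 1 / n
    print(n, (1 + h) ** (1 / h))
```

Note the contrast in convergence speed: 20 terms of the series already pin down $e$ to machine precision, while the limit form converges only at rate $O(1/n)$.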


Maybe you are being asked to think about this in terms of sequences of functions. You can define the sequence of functions $$f_n: (0,\infty)\to\mathbb R,\; f_n(a) = \frac{a^{1/n}-1}{1/n}$$ which has limit $$f(a) = \lim_{n\to\infty} f_n(a) = \lim_{n\to\infty} \frac{a^{1/n}-1}{1/n}.$$ From your post, we have that $$\forall n\geq 1,\; f_n(a_n) = 1$$ and want to determine whether this implies $$f(\lim_n a_n) = 1.$$ One way you might proceed is to define the inverses $$f_n^{-1}(x) = \left(1+\frac{x}{n}\right)^n.$$ Our question may then be rephrased as $$\text{Does } \lim_{n\to\infty} f_n^{-1}(1) = f^{-1}(1)?$$ Well, according to this other Stack Exchange post, Convergence of a sequence of functions and their inverses, the answer is yes, provided you can show that $f^{-1}$ is continuous.