Bernoulli's representation of Euler's number, i.e. $e=\lim \limits_{x\to \infty} \left(1+\frac{1}{x}\right)^x $
Solution 1:
$$\mathrm {Log}\left(\displaystyle\lim_{x\rightarrow\infty} \left(1 + \frac{1}{x}\right)^{x}\right) = \displaystyle\lim_{x\rightarrow 0}\text{ } \mathrm {Log} \left((1 + x)^{\frac{1}{x}}\right) = \lim_{x\rightarrow 0} \frac{\mathrm {Log}(1+x)}x = \lim_{x\rightarrow 0} \text{ }\displaystyle\sum_{i=0}^{\infty} \frac{(-1)^{i}\,x^i}{i+1} = 1.$$
Here the first equality uses the continuity of $\mathrm{Log}$, and the last step uses the Mercator series $\mathrm{Log}(1+x) = \sum_{i=1}^{\infty} \frac{(-1)^{i+1}x^i}{i}$, divided through by $x$.
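A quick numerical sanity check of the two key limits above, using only the Python standard library (a sketch, not a proof):

```python
import math

# Check lim_{x -> 0} log(1 + x) / x = 1 numerically:
for x in (1e-1, 1e-3, 1e-6):
    print(x, math.log(1 + x) / x)

# Partial sums of the alternating (Mercator) series
# sum_{i>=0} (-1)^i x^i / (i + 1) agree with log(1 + x) / x for small |x|:
x = 0.1
series = sum((-1) ** i * x ** i / (i + 1) for i in range(50))
print(series, math.log(1 + x) / x)
```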
Solution 2:
It does rather matter how you want to define $e$ in the first place. One way to define $e$ is to prove that the sequence whose $n$-th term is $(1 + \frac{1}{n})^n$ is increasing but bounded above, and therefore converges to its least upper bound, which may be defined as $e$. More generally, we may define $e^x$ as $\lim_{n \to \infty} (1 + \frac{x}{n})^n$ for any real $x$ (and the limit always exists). Then it's easy to verify from this definition that $e^{x+y} = e^{x} \cdot e^{y}$ for all $x,y \in \mathbb{R}$. With this approach the Bernoulli representation of $e$ is almost a non-issue.
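As an illustrative sketch of this definition (the helper name `exp_limit` is my own), one can compare $(1 + \frac{x}{n})^n$ for a large fixed $n$ against the built-in exponential, and check the functional equation numerically:

```python
import math

def exp_limit(x, n=10**7):
    """Approximate e^x via the limit definition (1 + x/n)^n, large fixed n."""
    return (1 + x / n) ** n

# The limit definition matches math.exp to good accuracy for moderate x:
for x in (1.0, 2.0, -0.5):
    print(x, exp_limit(x), math.exp(x))

# The functional equation e^{x+y} = e^x * e^y holds up to the
# approximation error:
print(abs(exp_limit(1.0) - exp_limit(0.3) * exp_limit(0.7)))
```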
The very definition of $(1 + \frac{1}{x})^{x}$ for non-integral $x$ (as $\exp(x \log(1 + \frac{1}{x}))$) presupposes that $e$ (and the natural logarithm) have already been defined.
Another way to define the function $e^x$ from first principles, adopted, for example, in Spivak's "Calculus", is as the inverse function of the logarithm, where $\log(x)$ is defined as $\int_{1}^{x}\frac{1}{t}\, dt$ for $x > 0$. Then the fundamental theorem of calculus gives $\log'(x) = \frac{1}{x}$ for $x > 0$, and if we define the exponential function as the inverse of the logarithm function, it is its own derivative. Since the exponential function is its own derivative and is always positive, it is increasing everywhere. The mean value theorem (applied to $\log(1 + \frac{1}{x}) = \int_{1}^{1+\frac{1}{x}} \frac{1}{t}\, dt$) tells us that $x\log(1 + \frac{1}{x}) = \frac{1}{\theta}$ for some $\theta \in (1,1+\frac{1}{x})$ when $x > 0$. As $x \to \infty$, we see that $\theta \to 1$. Since $e^{x}$ is differentiable everywhere, it is certainly continuous, so as $x \to \infty$, $\exp(x \log(1 + \frac{1}{x})) \to \exp(1) = e.$
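This construction can be mimicked numerically: approximate the Spivak-style $\log$ by quadrature of $\int_1^x \frac{dt}{t}$ and invert it by bisection to recover $e$. This is only a sketch; `log_integral` and `exp_inverse` are names I've made up for the illustration:

```python
import math

def log_integral(x, steps=100_000):
    """Spivak-style log: the integral of 1/t from 1 to x (midpoint rule)."""
    h = (x - 1) / steps
    return sum(h / (1 + (i + 0.5) * h) for i in range(steps))

def exp_inverse(y, lo=0.5, hi=4.0, iters=60):
    """Invert the integral logarithm by bisection: find x with log(x) = y."""
    for _ in range(iters):
        mid = (lo + hi) / 2
        if log_integral(mid, steps=2000) < y:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

print(log_integral(math.e))   # close to 1
print(exp_inverse(1.0))       # close to e
```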
NOTE ADDED: Since the question has been rephrased, taking $e = \sum_{i=0}^{\infty} \frac{1}{i!}$, after the above was written, I add that the second approach here covers that as well: the fact that the exponential function is its own derivative shows that its Maclaurin series is the expected $e^{x} = \sum_{n=0}^{\infty} \frac{x^n}{n!}$, and this converges to $e^x$ for all real $x$ by the standard form of the remainder in Taylor's theorem (as, e.g., in Spivak's book).
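For illustration, the Maclaurin series can be summed term by term (the helper `exp_series` is my own name); the rapid convergence reflects the factorial decay of the remainder:

```python
import math

def exp_series(x, terms=30):
    """Partial sum of the Maclaurin series sum_{n>=0} x^n / n!."""
    total, term = 0.0, 1.0
    for n in range(terms):
        total += term
        term *= x / (n + 1)   # next term: x^{n+1} / (n+1)!
    return total

# The x = 1 case recovers e itself, and other x match math.exp closely:
print(exp_series(1.0), math.e)
print(exp_series(2.0), math.exp(2.0))
```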
Solution 3:
From the binomial theorem
$$\left(1+\frac{1}{n}\right)^n = \sum_{k=0}^n {n \choose k} \frac{1}{n^k} = \sum_{k=0}^n \frac{n}{n}\frac{n-1}{n}\frac{n-2}{n}\cdots\frac{n-k+1}{n}\frac{1}{k!}$$
but as $n \to \infty$, each term in the sum increases towards a limit of $\frac{1}{k!}$, while the number of terms to be summed also grows; since the terms are nonnegative and increase monotonically in $n$, passing to the limit term by term is justified, so
$$\left(1+\frac{1}{n}\right)^n \to \sum_{k=0}^\infty \frac{1}{k!}.$$
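A short numerical comparison (illustrative only) of the two sides of this limit:

```python
import math

# Bernoulli's sequence (1 + 1/n)^n converges to e, but only slowly
# (the error is roughly e / (2n)):
for n in (10, 1000, 100000):
    print(n, (1 + 1 / n) ** n)

# The factorial series converges far faster:
partial = sum(1 / math.factorial(k) for k in range(20))
print(partial, math.e)
```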
Solution 4:
Well, the problem with $e$ is that there are many different ways of defining it. But this is another way.
Suppose the limit exists, and call it $L$.
Writing the limit in $\frac{0}{0}$ form and applying L'Hôpital's rule:
$$\log L = \lim_{x\to\infty} x \log \left( \frac{x + 1}{x} \right) = \lim_{x\to\infty} \frac{ \log \frac{x + 1}{x}}{\frac{1}{x}} = \lim_{x\to\infty} \frac{ \frac{x}{x+1} \cdot \left(-x^{-2} (x + 1) + x^{-1}\right)}{ -x^{-2} } = \lim_{x\to\infty} \frac{ \frac{x}{x+1} \cdot \frac{-1}{x^2} }{\frac{-1}{x^2}} = \lim_{x\to\infty} \frac{x}{x+1} = 1.$$
So $L = e^{1} = e$.
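As a sanity check on this computation, one can evaluate $x \log \frac{x+1}{x}$ for large $x$ and watch it approach $1$:

```python
import math

# x * log((x+1)/x) should approach 1 as x grows, in line with the
# L'Hopital computation above (the error is roughly 1/(2x)):
for x in (10.0, 1e4, 1e8):
    print(x, x * math.log((x + 1) / x))
```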