Proof of $\lim_{n\to \infty} \sqrt[n]{n}=1$

Thomson et al. provide a proof that $\lim_{n\rightarrow \infty} \sqrt[n]{n}=1$ in their book (page 73). Their proof uses an inequality that relies on the binomial theorem.
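For reference, the standard binomial-theorem argument (which may be the one the book uses; I am reconstructing it here) runs as follows: write $\sqrt[n]{n} = 1 + h_n$ with $h_n \ge 0$. Then, for $n \ge 2$,

$$ n = (1 + h_n)^n \ge \binom{n}{2} h_n^2 = \frac{n(n-1)}{2} h_n^2, \quad \text{so} \quad 0 \le h_n \le \sqrt{\frac{2}{n-1}} \rightarrow 0. $$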

I have an alternative proof, which I learned elsewhere, as follows.


Proof.

We know that \begin{align} \lim_{n\rightarrow \infty} \frac{ \log n}{n} = 0. \end{align}

Then using this, I can instead prove: \begin{align} \lim_{n\rightarrow \infty} \sqrt[n]{n} &= \lim_{n\rightarrow \infty} \exp\left(\frac{ \log n}{n}\right) \newline & = \exp(0) \newline & = 1. \end{align}
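For completeness, the first limit can be justified elementarily using $\log x \le x - 1$ for $x > 0$:

$$ 0 \le \frac{\log n}{n} = \frac{2 \log \sqrt{n}}{n} \le \frac{2(\sqrt{n} - 1)}{n} < \frac{2}{\sqrt{n}} \rightarrow 0. $$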


On the one hand, it seems like a valid proof to me. On the other hand, I know I should be careful with infinite sequences. The step I'm most unsure of is: \begin{align} \lim_{n\rightarrow \infty} \sqrt[n]{n} = \lim_{n\rightarrow \infty} \exp\left(\frac{ \log n}{n}\right) \end{align}

I know the identity $\sqrt[n]{n} = \exp\left(\frac{\log n}{n}\right)$ holds for each fixed $n$, but I'm not sure whether it survives the limit $n\rightarrow \infty$.

Question:

If I am correct, are there cases where this approach would fail? Specifically, given any sequence $x_n$, can I always assume: \begin{align} \lim_{n\rightarrow \infty} x_n = \lim_{n\rightarrow \infty} \exp(\log x_n) \end{align} Or are there sequences that invalidate that identity?


(Edited to expand the last question.) Given any sequence $x_n$, can I always assume: \begin{align} \lim_{n\rightarrow \infty} x_n &= \exp\left(\log \lim_{n\rightarrow \infty} x_n\right) \newline &= \exp\left(\lim_{n\rightarrow \infty} \log x_n\right) \newline &= \lim_{n\rightarrow \infty} \exp( \log x_n) \end{align} Or are there sequences that invalidate any of the above identities?

(Edited to repurpose this question.) Please also feel free to add different proofs of $\lim_{n\rightarrow \infty} \sqrt[n]{n}=1$.


Here is one using $AM \ge GM$ applied to $1$ appearing $n-2$ times and $\sqrt{n}$ appearing twice (so $n$ numbers in total, with product $\sqrt{n} \cdot \sqrt{n} = n$ and hence geometric mean $n^{1/n}$), valid for $n \ge 2$:

$$\frac{1 + 1 + \dots + 1 + \sqrt{n} + \sqrt{n}}{n} \ge n^{1/n}$$

i.e.

$$\frac{n - 2 + 2 \sqrt{n}}{n} \ge n^{1/n}$$

i.e.

$$ 1 - \frac{2}{n} + \frac{2}{\sqrt{n}} \ge n^{1/n} \ge 1$$

Since the left-hand side tends to $1$ as $n \to \infty$, the squeeze theorem gives $\lim_{n \to \infty} n^{1/n} = 1$.


Since $x \mapsto \log x$ is continuous on $(0, \infty)$, and since continuous functions respect limits, $$ \lim_{n \to \infty} f(g(n)) = f\left( \lim_{n \to \infty} g(n) \right) $$ for any continuous function $f$ (provided $\displaystyle\lim_{n \to \infty} g(n)$ exists and lies in the domain of $f$), your proof is entirely correct. Specifically, $$ \log \left( \lim_{n \to \infty} \sqrt[n]{n} \right) = \lim_{n \to \infty} \log \sqrt[n]{n} = \lim_{n \to \infty} \frac{\log n}{n}, $$

and hence

$$ \lim_{n \to \infty} \sqrt[n]{n} = \exp \left[\log \left( \lim_{n \to \infty} \sqrt[n]{n} \right) \right] = \exp\left(\lim_{n \to \infty} \frac{\log n}{n} \right) = \exp(0) = 1. $$
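As for the general identity in your question, the only obstructions are domain issues: $\log x_n$ requires $x_n > 0$, and exchanging $\log$ with the limit requires $\lim_{n \to \infty} x_n$ to exist and be positive. For example, with

$$ x_n = \frac{1}{n}: \qquad \lim_{n \to \infty} \exp(\log x_n) = \lim_{n \to \infty} x_n = 0, \qquad \text{but} \qquad \lim_{n \to \infty} \log x_n = -\infty, $$

so $\exp\left( \lim_{n \to \infty} \log x_n \right)$ is undefined unless one adopts the convention $e^{-\infty} = 0$. When $\lim_{n \to \infty} x_n$ exists and is positive, as here, all three of your identities hold.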


Here's a two-line, completely elementary proof that uses only Bernoulli's inequality:

$$(1+n^{-1/2})^n \ge 1+n^{1/2} > n^{1/2}$$ so, raising to the $2/n$ power, $$ n^{1/n} < (1+n^{-1/2})^2 = 1 + 2 n^{-1/2} + 1/n < 1 + 3 n^{-1/2}.$$
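Spelled out: the first display is Bernoulli's inequality $(1+x)^n \ge 1 + nx$ with $x = n^{-1/2}$, and the last step uses $1/n \le n^{-1/2}$ (strict for $n \ge 2$). Combined with the trivial bound $n^{1/n} \ge 1$, the squeeze theorem gives

$$ 1 \le n^{1/n} < 1 + \frac{3}{\sqrt{n}} \rightarrow 1. $$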

I discovered this independently, and then found a very similar proof in Courant and Robbins' "What Is Mathematics?".


$\sqrt[n]{n}=\sqrt[n]{1\cdot\frac{2}{1}\cdot\frac{3}{2}\cdots\frac{n-1}{n-2}\cdot\frac{n}{n-1}}$, since the product telescopes to $n$. So $\sqrt[n]{n}$ is the sequence of geometric means of the sequence $a_1 = 1$, $a_{n}=\frac{n}{n-1}$ for $n \ge 2$. Since the geometric means of a convergent sequence of positive terms converge to the same limit, $\lim_{n\to\infty}\sqrt[n]{n} = \lim_{n\to\infty}a_{n}=\lim_{n\to\infty}\frac{n}{n-1}=1$.
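The fact used here follows from the Cesàro mean theorem applied to logarithms: if $a_k > 0$ and $a_k \to L > 0$, then $\log a_k \to \log L$, so the arithmetic means of $\log a_k$ also converge to $\log L$, and by continuity of $\exp$,

$$ \sqrt[n]{a_1 a_2 \cdots a_n} = \exp\left( \frac{1}{n} \sum_{k=1}^{n} \log a_k \right) \rightarrow \exp(\log L) = L. $$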