Proving an asymptotic bound: $\Theta$ notation for $\ln(n!)$

I'm interested in the exact asymptotic bound $$\ln(n!)= \Theta(n\ln(n))$$

and whether it even holds.

This means there are constants $c_1, c_2 > 0$ such that, for all sufficiently large $n$, $$\ln(n!) \le c_1\, n\ln(n)$$ $$\ln(n!) \ge c_2\, n\ln(n)$$

or

$$0 < \lim\limits_{n \to \infty}\frac{\ln(n!)}{n\ln(n)} < \infty$$

I could try L'Hôpital's rule on the limit, since it is of the form $\frac{\infty}{\infty}$, but the derivative of $\ln(n!)$ causes problems ($n!$ is only defined for integer $n$). Also, I don't know how to prove the two statements above, or how to disprove them. Any help is appreciated.
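For what it's worth, here is a quick numerical check (a sketch, not a proof) suggesting the ratio stays bounded. It uses Python's `math.lgamma`, since $\ln(n!) = \ln\Gamma(n+1)$, to avoid computing $n!$ itself:

```python
# Numerical sanity check (not a proof): evaluate ln(n!)/(n ln n) for growing n.
# math.lgamma(n + 1) equals ln(n!), avoiding the huge intermediate value n!.
import math

for n in [10, 100, 1_000, 10_000, 100_000, 1_000_000]:
    ratio = math.lgamma(n + 1) / (n * math.log(n))
    print(f"n = {n:>9}: ln(n!)/(n ln n) = {ratio:.6f}")
```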


Solution 1:

As in this other question, I give two answers below. The first one is simple, and applies if you do not care about getting the exact constant in the asymptotics (i.e., if a $\Theta(\cdot)$ result is enough). The second is a tad longer, but gives the exact leading term, and actually more: in this particular case it will also yield the next two or three lower-order terms.


Method 1: Upper and lower bounds. Note that $\ln(n!) = \sum_{k=1}^n \ln k$. You have $$ \sum_{k=\lfloor n/2\rfloor }^n \ln k \leq \sum_{k=1}^n \ln k \leq \sum_{k=1}^n \ln n = n \ln n $$ and $$ \sum_{k=\lfloor n/2\rfloor }^n \ln k \geq \frac{n}{2}\ln \left\lfloor\frac{n}{2}\right\rfloor \geq \frac{n}{2}\ln\left(\frac{n}{2} - 1\right) = \frac{n}{2}\left(\ln(n-2) - \ln 2\right) \operatorname*{\sim}_{n\to\infty} \frac{1}{2} n\ln n. $$ Putting it together, $$ \sum_{k=1}^n \ln k = \Theta\left( n \ln n \right) $$ and moreover you have, for any $\epsilon\in(0,1]$ and $n$ big enough, $$\frac{1-\epsilon}{2}n\ln n \leq \sum_{k=1}^n \ln k \leq n\ln n.$$
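A minimal numerical sketch of this sandwich (the helper name `log_factorial` is just for illustration), checking $\frac{1-\epsilon}{2}n\ln n \leq \sum_{k=1}^n \ln k \leq n\ln n$ for a few values of $n$, with $\epsilon = 0.1$:

```python
# Sketch: check (1 - eps)/2 * n ln n <= sum_{k=1}^n ln k <= n ln n numerically.
import math

def log_factorial(n: int) -> float:
    """Compute ln(n!) as the sum of ln k for k = 1..n."""
    return sum(math.log(k) for k in range(1, n + 1))

eps = 0.1  # any eps in (0, 1]; the lower bound then holds for n large enough
for n in [100, 1_000, 10_000]:
    s = log_factorial(n)
    lower = (1 - eps) / 2 * n * math.log(n)
    upper = n * math.log(n)
    print(f"n = {n:>6}: {lower:12.2f} <= {s:12.2f} <= {upper:12.2f}  ->  {lower <= s <= upper}")
```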


Method 2: Using integrals, which are often much easier to handle than sums.

Note that $\ln(n!) = \sum_{k=1}^n \ln k$. We will use the fact that $f=\ln$ is a non-decreasing function on $[1,\infty)$ to do a series-integral comparison. (I keep writing $f$ below to highlight that this is a general technique.)

For any $k$, we have $$ f(k) \leq f(x) \leq f(k+1) $$ for all $x\in[k,k+1)$. Integrating this inequality over $[k,k+1)$, we get $$ f(k) = \int_k^{k+1} f(k)\, dx \leq \int_k^{k+1} f(x)\, dx \leq \int_k^{k+1} f(k+1)\, dx = f(k+1). $$ Summing this inequality for $k$ between $1$ and $n-1$, we get $$ \sum_{k=1}^{n-1} f(k) \leq \sum_{k=1}^{n-1} \int_k^{k+1} f(x)\, dx = \int_1^{n} f(x)\, dx \leq \sum_{k=1}^{n-1} f(k+1) = \sum_{k=2}^{n} f(k) $$ or, rearranging (and adding the missing terms $f(1)$ and $f(n)$ where needed to complete the sums), $$ \int_1^{n} f(x)\, dx + f(1) \leq \sum_{k=1}^{n} f(k) \leq \int_1^{n} f(x)\, dx + f(n). $$ Now, we want to squeeze the sum between these two bounds, which is only useful if $f(1)$ and $f(n)$ are negligible compared to $\int_1^{n} f(x)\, dx$ (otherwise, no "squeezing" can happen). Fortunately, this is the case here.
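To illustrate this sandwich numerically for $f = \ln$ (a sketch only; it uses the antiderivative $F(x) = x\ln x - x$, which is worked out explicitly just below):

```python
# Sketch of the series-integral comparison with f = ln and F(x) = x ln x - x:
#   F(n) - F(1) + f(1)  <=  sum_{k=1}^n f(k)  <=  F(n) - F(1) + f(n)
import math

def F(x: float) -> float:
    """Antiderivative of ln: F'(x) = ln(x)."""
    return x * math.log(x) - x

for n in [10, 100, 1_000, 10_000]:
    s = sum(math.log(k) for k in range(1, n + 1))  # sum_{k=1}^n ln k
    integral = F(n) - F(1)                         # integral_1^n ln x dx
    lower, upper = integral + math.log(1), integral + math.log(n)
    print(f"n = {n:>6}: {lower:.2f} <= {s:.2f} <= {upper:.2f}  ->  {lower <= s <= upper}")
```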

More precisely, you have $$ \int_1^{n} f(x) dx = \int_1^{n} \ln x dx = \left[x\ln x -x\right]^n_1 = n\ln n - n + 1 $$ and $f(1)=0$, $f(n)=\ln n$. Therefore, you have $$ n\ln n - n + 1 \leq \sum_{k=1}^{n} \ln k \leq n\ln n - n + \ln n + 1 $$ that is $$ \sum_{k=1}^{n} \ln k = n\ln n - n + o(n). $$ If you are only interested in the leading term: $$ \sum_{k=1}^{n} \ln k \operatorname*{\sim}_{n\to\infty} n\ln n. $$
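A short numerical sketch of this conclusion (again using `math.lgamma` for $\ln(n!)$): the ratio below tends to $1$, and the remainder $\ln(n!) - (n\ln n - n)$ stays within the bounds $[1, \ln n + 1]$ derived above, hence is $o(n)$.

```python
# Sketch of the conclusion: ln(n!) = n ln n - n + o(n).
import math

for n in [10, 100, 1_000, 10_000, 100_000]:
    lf = math.lgamma(n + 1)                  # ln(n!)
    ratio = lf / (n * math.log(n))           # tends to 1 (slowly)
    remainder = lf - (n * math.log(n) - n)   # between 1 and ln(n) + 1
    print(f"n = {n:>7}: ratio = {ratio:.5f}, remainder = {remainder:.3f}, ln(n)+1 = {math.log(n) + 1:.3f}")
```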

(Also, please: why L'Hôpital? It is really not needed here.)