Are the logarithms in number theory natural?

I find the frequent emergence of logarithms and even nested logarithms in number theory, especially the prime number counting business, somewhat unsettling. What is the reason for them?

Does it maybe have to do with the series expansion of the logarithm? Or is there something inherently exponential in any of the relevant number distributions, as in complexity theory or combinatorial problems? I suspect it may lie in how you can construct bigger integers out of smaller ones.


Solution 1:

If you are asking, why do you find it unsettling that logarithms occur in Number Theory, I'm afraid you will have to ask a psychiatrist.

If you are asking, why are there logarithms in Number Theory, consider the following naive effort to find the number of primes up to $N$:

There are $N$ integers up to $N$; $(1/2)N$ of them are odd; $(1/2)(2/3)N$ are relatively prime to 2 and 3; $(1/2)(2/3)(4/5)N$ are relatively prime to 2, 3, and 5; and so on. Continue this reasoning up to the largest prime $p$ not exceeding $\sqrt N$, and what's left should be the primes between $\sqrt N$ and $N$. So you have to estimate $(1/2)(2/3)(4/5)\cdots((p-1)/p)$, and in the limit as $N\to\infty$, that product looks something like $1/\log N$ (but not exactly; the naive argument needs a fair bit of sophisticated tweaking). It's the limiting process that lets logarithms into the mix.
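You can watch that "not exactly" happen numerically. Here is a short Python sketch (the `primes_upto` sieve is just a throwaway helper for this illustration): by Mertens' third theorem the product actually behaves like $2e^{-\gamma}/\log N \approx 1.12/\log N$, and that extra constant is part of the sophisticated tweaking the naive argument misses.

```python
from math import log

def primes_upto(n):
    """Simple sieve of Eratosthenes; returns all primes <= n."""
    sieve = bytearray([1]) * (n + 1)
    sieve[0] = sieve[1] = 0
    for p in range(2, int(n**0.5) + 1):
        if sieve[p]:
            sieve[p*p::p] = bytearray(len(sieve[p*p::p]))
    return [p for p in range(2, n + 1) if sieve[p]]

N = 10**6
product = 1.0
for p in primes_upto(int(N**0.5)):
    product *= (p - 1) / p      # the (1/2)(2/3)(4/5)... factors
print(product * log(N))         # hovers near 2*e^(-gamma) ~ 1.12, not 1
```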

Solution 2:

One reason that it occurs in analytic number theory has to do with the Riemann zeta function:

$$\zeta(s) = \displaystyle\sum_{n=1}^{\infty} \frac{1}{n^s},$$

where $s = \sigma + it$ is a complex number and, for reasons of convergence, we require that $\sigma > 1$. It turns out that, thanks to an argument of Euler, we have

$$\zeta(s) = \displaystyle\sum_{n=1}^{\infty} \frac{1}{n^s} = \displaystyle\prod_p \frac{1}{1-p^{-s}}.$$

(Here $\displaystyle\prod_p$ indicates taking a product over all prime numbers $p$, just as $\displaystyle\sum_p$ indicates taking a sum over all primes $p$.) Big products like that aren't as easy to use and manipulate as sums, so mathematicians will do just about anything to turn a product into a sum to make it easier to work with. You probably know these three facts about logarithms:

i) $\log(ab) = \log(a)+\log(b)$,

ii) $\log{\frac{a}{b}}=\log(a)-\log(b)$, and

iii) $\log(1)=0$.

Taking the logarithm of both sides of the equation above and exploiting those three rules, we get the following:

$$\begin{array}{ll} \log(\zeta(s)) &= \log \left( \displaystyle\prod_p \frac{1}{1-p^{-s}} \right) \\ &= \displaystyle\sum_p \log \left(\frac{1}{1-p^{-s}} \right) \\ &= \displaystyle\sum_p \left[ \log(1) - \log({1-p^{-s}}) \right] \\ &= -\displaystyle\sum_p \log({1-p^{-s}}). \end{array}$$
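This identity is easy to check numerically. Here is a Python sketch at the real point $s = 2$ (the `primes_upto` sieve is a throwaway helper, and both the series and the prime sum are truncated, so the agreement is only approximate):

```python
from math import log, pi

def primes_upto(n):
    """Simple sieve of Eratosthenes; returns all primes <= n."""
    sieve = bytearray([1]) * (n + 1)
    sieve[0] = sieve[1] = 0
    for p in range(2, int(n**0.5) + 1):
        if sieve[p]:
            sieve[p*p::p] = bytearray(len(sieve[p*p::p]))
    return [p for p in range(2, n + 1) if sieve[p]]

s = 2  # a real s > 1, where everything converges absolutely
lhs = log(sum(1 / n**s for n in range(1, 10**6)))       # log of (truncated) zeta(s)
rhs = -sum(log(1 - p**-s) for p in primes_upto(10**4))  # -sum_p log(1 - p^-s)
print(lhs, rhs)  # both near log(pi^2 / 6) ~ 0.4978
```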

Without going into too much detail: once you get to this point, you apply a standard operation from calculus called a "derivative" and then scrutinize the resulting series with the tools of complex analysis.
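For the curious, carrying out that differentiation term by term (which is justified for $\sigma > 1$) produces the standard identity

$$-\frac{\zeta'(s)}{\zeta(s)} = \sum_p \sum_{k=1}^{\infty} \frac{\log p}{p^{ks}} = \sum_{n=1}^{\infty} \frac{\Lambda(n)}{n^s},$$

where $\Lambda(n)$ is the von Mangoldt function ($\Lambda(n) = \log p$ if $n = p^k$ is a prime power, and $0$ otherwise). Notice the logarithms showing up yet again, this time as coefficients of the series being scrutinized.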

This general technique of investigation leads directly to proving the prime number theorem, which says

$$\displaystyle\lim_{x \rightarrow \infty} \frac{\pi(x)}{\frac{x}{\log(x)}} = 1,$$

which can be stated roughly in English as "as you go farther down the number line, calculating $\frac{x}{\log(x)}$ will give you a pretty good estimate of how many primes there are up to that point".
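To put a number on "pretty good," here is a quick comparison at $x = 10^6$ (the `prime_pi` sieve below is just a throwaway helper for this illustration):

```python
from math import log

def prime_pi(x):
    """Count primes <= x with a simple sieve of Eratosthenes."""
    sieve = bytearray([1]) * (x + 1)
    sieve[0] = sieve[1] = 0
    for p in range(2, int(x**0.5) + 1):
        if sieve[p]:
            sieve[p*p::p] = bytearray(len(sieve[p*p::p]))
    return sum(sieve)

x = 10**6
print(prime_pi(x))                  # 78498 primes below one million
print(x / log(x))                   # about 72382, an underestimate
print(prime_pi(x) / (x / log(x)))   # ratio ~ 1.08, creeping toward 1
```

The ratio approaches 1 only slowly, which is why sharper approximations like the logarithmic integral are preferred in practice.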