I was just reading this question and its answer, How will this equation imply PNT, and it raised a whole new question:

Given that $\sum_{n\le x} \Lambda(n)=x+o(x)$, prove that $$\sum_{n\le x} \frac{\Lambda(n)}{n}=\log x-\gamma +o(1),$$ where $\gamma$ is the Euler-Mascheroni constant.

The above question is the one the bounty is for; the full version has been bothering me. Gerry Myerson pointed out that we can prove $\sum_{n\leq x } \frac{\Lambda(n)}{n} =\log x +O(1)$ from the Chebyshev estimate alone, but that is not my question.

My attempts: I tried partial summation and ended up with something like $\sum_{n} \frac{\psi(n)-n}{n^2}$. For that series to converge, $\psi(n)-n=o(n)$ isn't strong enough. To make my approach work, I would need to assume $$\psi(x)-x = O\left(\frac{x}{(\log x)^{1+\epsilon}}\right)$$ so that the series converges. (Otherwise the partial sums could grow as fast as $\log x$, which is no good.)
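For concreteness, here is a quick numerical sketch (Python; the simple sieve and the checkpoints are my own additions, purely for illustration) tabulating the partial sums $\sum_{n\le x}\frac{\psi(n)-n}{n^2}$ that come out of this partial summation:

```python
# Illustration only: tabulate the partial sums  sum_{n<=x} (psi(n) - n) / n^2
# coming out of the partial summation, at a few checkpoints.
import math

def mangoldt_table(N):
    """Lambda(n) for 0 <= n <= N via a simple sieve: Lambda(p^k) = log p, else 0."""
    lam = [0.0] * (N + 1)
    is_prime = [True] * (N + 1)
    for p in range(2, N + 1):
        if is_prime[p]:
            for m in range(2 * p, N + 1, p):
                is_prime[m] = False
            q = p
            while q <= N:
                lam[q] = math.log(p)
                q *= p
    return lam

N = 10**6
lam = mangoldt_table(N)
psi = 0.0   # psi(n) = sum_{m <= n} Lambda(m)
acc = 0.0   # running value of sum_{n <= x} (psi(n) - n) / n^2
for n in range(1, N + 1):
    psi += lam[n]
    acc += (psi - n) / (n * n)
    if n in (10**3, 10**4, 10**5, 10**6):
        print(f"x = {n:>8}:  sum_(n<=x) (psi(n)-n)/n^2 = {acc:+.6f}")
```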

Can we prove the above estimate using only the basic prime number theorem $\psi(x)-x=o(x)$? Why or why not? Thank you!

Please note that my question is about a subtlety with convergence; blunt partial summation doesn't seem to work.


A few years ago, I assigned something a little weaker, $$\sum_{n\le x}{\Lambda(n)\over n}=\log x+r(x){\rm\ with\ }|r(x)|\le2$$ and it turned out to be the hardest problem on the assignment. Here's the proof I eventually wrote up.

Let $s=[x]$. Note $$\sum_{n\le x}{\Lambda(n)\over n}=\sum_{n\le s}{\Lambda(n)\over n}$$ and $0\le\log x-\log s\lt1$. Now

$$\log s!=\sum_{m\le s}\log m=\sum_{p^r\le s}\log p\sum_{m\le s,p^r\mid m}1=\sum_{p^r\le s}\log p[s/p^r]=\sum_{n\le s}\Lambda(n)[s/n]$$

$$=\sum_{n\le s}\Lambda(n)\left({s\over n}-\left\lbrace{s\over n}\right\rbrace\right)=s\sum_{n\le s}{\Lambda(n)\over n}-\sum_{n\le s}\Lambda(n)\left\lbrace{s\over n}\right\rbrace$$
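Here is a tiny numerical sanity check of that identity (Python; not part of the proof, and the `mangoldt` helper is just ad hoc trial division):

```python
# Sanity check, not part of the proof:  log s! = sum_{n<=s} Lambda(n) * floor(s/n).
import math

def mangoldt(n):
    """Lambda(n) = log p if n is a prime power p^k, else 0 (trial division)."""
    for p in range(2, math.isqrt(n) + 1):
        if n % p == 0:
            while n % p == 0:
                n //= p
            return math.log(p) if n == 1 else 0.0
    return math.log(n) if n > 1 else 0.0

for s in (10, 100, 1000):
    lhs = math.lgamma(s + 1)                                   # log s!
    rhs = sum(mangoldt(n) * (s // n) for n in range(1, s + 1))
    print(s, lhs, rhs)    # the two values agree up to rounding
```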

Now $$\sum_{n\le s}\Lambda(n)\left\lbrace{s\over n}\right\rbrace\le\sum_{n\le s}\Lambda(n)\le Cs$$ for some constant $C$ by the Prime Number Theorem (which is actually overkill, as one can prove without PNT that that last sum is bounded by $2s$ for all $s$). Also, comparing $\log s!=\sum\log n$ to $\int_1^s\log t\,dt$ we get $$\log s!=s\log s-s+b(s){\rm\ with\ }|b(s)|\lt\log s+1$$ (cf. Stirling's formula). Putting it together, $$\sum_{n\le x}{\Lambda(n)\over n}=(1/s)\log s!+(1/s)\sum_{n\le s}\Lambda(n)\left\lbrace{s\over n}\right\rbrace=\log s-1+b(s)/s+E(s)=\log x+r(x),$$ where $0\le E(s)\le C$ is the contribution of the bracket sum. Since $C\le2$, $0\le b(s)/s\le1$ (the integral comparison also gives $b(s)\ge0$), and $0\le\log x-\log s\lt1$, we get $|r(x)|\le2$.
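For illustration only, here is a short Python sketch (my own addition, not part of the proof) that tabulates $r(x)=\sum_{n\le x}\Lambda(n)/n-\log x$; the argument above gives $|r(x)|\le2$, and the stronger result says $r(x)\to-\gamma\approx-0.5772$:

```python
# Illustration only: r(x) = sum_{n<=x} Lambda(n)/n - log x.  The argument above
# gives |r(x)| <= 2; the stronger result says r(x) -> -gamma ~ -0.5772.
import math

def mangoldt_table(N):
    lam = [0.0] * (N + 1)
    is_prime = [True] * (N + 1)
    for p in range(2, N + 1):
        if is_prime[p]:
            for m in range(2 * p, N + 1, p):
                is_prime[m] = False
            q = p
            while q <= N:
                lam[q] = math.log(p)
                q *= p
    return lam

N = 10**6
lam = mangoldt_table(N)
acc = 0.0   # running value of sum_{n<=x} Lambda(n)/n
for n in range(1, N + 1):
    acc += lam[n] / n
    if n in (10**2, 10**3, 10**4, 10**5, 10**6):
        print(f"x = {n:>8}:  r(x) = {acc - math.log(n):+.4f}")
```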

EDIT: The stronger result is proved in the textbook by Bateman and Diamond, Analytic Number Theory, pages 100-102. First they prove that, with the usual definitions, $\psi(x)\sim x$ implies $M(x)=o(x)$, where $M(x)$ is the summatory function for the Möbius $\mu$-function. Then from the result on $M(x)$ they deduce the result you want. It's a bit long to write out as I'd have to explain the notation introduced earlier in the chapter.

MORE EDIT: It's also Proposition 3.4.4 of Jameson's textbook, The Prime Number Theorem, and it's proved on page 91 of Tenenbaum and Mendès France, The Prime Numbers and Their Distribution, in the Student Mathematical Library series of the American Mathematical Society. Again, the proofs require too much previously developed material for me to attempt to write them out here.


If you add $\frac 1n$ to your $\Lambda(n)$, the estimate in the condition does not change, because the harmonic sum $H(x)=\sum_{n\le x}\frac1n\sim\log x$ is $o(x)$.

But the desired conclusion changes by $\sum_{n\le x}\frac{1}{n^2}=\frac{\pi^2}{6}+o(1)$, a nonzero constant.

Therefore, your condition is not sufficient to prove the conclusion.
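To see the counterexample numerically, here is a small Python sketch (illustration only; the sieve and checkpoints are mine) comparing $\Lambda$ with $\Lambda'(n)=\Lambda(n)+\frac1n$: the hypothesis barely moves, while the weighted sums differ by about $\pi^2/6\approx1.645$.

```python
# Illustration of the counterexample: Lambda'(n) = Lambda(n) + 1/n obeys the same
# hypothesis (its summatory function is x + o(x)), yet the weighted sums differ
# by about pi^2/6.
import math

def mangoldt_table(N):
    lam = [0.0] * (N + 1)
    is_prime = [True] * (N + 1)
    for p in range(2, N + 1):
        if is_prime[p]:
            for m in range(2 * p, N + 1, p):
                is_prime[m] = False
            q = p
            while q <= N:
                lam[q] = math.log(p)
                q *= p
    return lam

N = 10**5
lam = mangoldt_table(N)
psi = psi2 = w = w2 = 0.0
for n in range(1, N + 1):
    psi  += lam[n]                    # psi(x)  for Lambda
    psi2 += lam[n] + 1.0 / n          # psi'(x) for Lambda + 1/n
    w    += lam[n] / n                # sum Lambda(n)/n
    w2   += (lam[n] + 1.0 / n) / n    # sum (Lambda(n) + 1/n)/n
x = float(N)
print("hypothesis :", psi / x, "vs", psi2 / x, "(both ~ 1)")
print("conclusion :", w - math.log(x), "vs", w2 - math.log(x),
      "; shift =", w2 - w, "~ pi^2/6 =", math.pi**2 / 6)
```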


By definition

$$\gamma=\sum_{n\le x}{1\over n}-\log x+o(1).$$

So we see that $$\log x +\gamma + o(1)=\sum_{n\le x}{1\over n}.$$

So it is sufficient to show that $$\sum_{n\le x}{\Lambda(n)-1\over n}=-2\gamma+o(1),$$ since adding the harmonic sum back in then gives $\log x-\gamma+o(1)$.
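(Numerical aside, illustration only: a Python sketch of this reduction, tabulating $\sum_{n\le x}\frac{\Lambda(n)-1}{n}$ against the target $-2\gamma\approx-1.1544$.)

```python
# Sanity check of the reduction: sum_{n<=x} (Lambda(n) - 1)/n should tend to -2*gamma.
import math

GAMMA = 0.5772156649015329   # Euler-Mascheroni constant

def mangoldt_table(N):
    lam = [0.0] * (N + 1)
    is_prime = [True] * (N + 1)
    for p in range(2, N + 1):
        if is_prime[p]:
            for m in range(2 * p, N + 1, p):
                is_prime[m] = False
            q = p
            while q <= N:
                lam[q] = math.log(p)
                q *= p
    return lam

N = 10**6
lam = mangoldt_table(N)
acc = 0.0
for n in range(1, N + 1):
    acc += (lam[n] - 1.0) / n
    if n in (10**3, 10**4, 10**5, 10**6):
        print(f"x = {n:>8}:  sum = {acc:+.4f}   (target -2*gamma = {-2 * GAMMA:+.4f})")
```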

Then using Möbius inversion in the form

$$G(x)=\sum_{n\le x}F\left({x\over n}\right)\iff F(x)=\sum_{n\le x}\mu(n)G\left({x\over n}\right)$$

and the asymptotic estimate

$$\sum_{n\le x}\bigg(\log n-d(n)\bigg)=-2\gamma x+O(\sqrt{x})\quad (*)$$

we get the desired result. Note: you'll end up using the PNT in a $\mu$-function estimate during the Möbius inversion; this is how you use your given, and the given estimate is where you save over just using the PNT by itself. As @Phira notes, this is impossible from the stated condition alone.
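One way to read the inversion step at the level of arithmetic functions is the convolution identity $\Lambda-\mathbf 1=\mu*(\log-d)$ (this is my reading of what is being inverted; the answer's $F$ and $G$ may be organized differently). A short Python check of that identity, illustration only:

```python
# Check of the convolution identity  Lambda - 1 = mu * (log - d)  (my reading of
# the identity behind the inversion step); should print essentially 0.
import math

def tables(N):
    """mu(n), d(n) (number of divisors), Lambda(n) for 1 <= n <= N."""
    mu = [1] * (N + 1)
    dcnt = [0] * (N + 1)
    lam = [0.0] * (N + 1)
    is_prime = [True] * (N + 1)
    for p in range(2, N + 1):
        if is_prime[p]:
            for m in range(2 * p, N + 1, p):
                is_prime[m] = False
            for m in range(p, N + 1, p):
                mu[m] *= -1
            for m in range(p * p, N + 1, p * p):
                mu[m] = 0
            q = p
            while q <= N:
                lam[q] = math.log(p)
                q *= p
    for d in range(1, N + 1):
        for m in range(d, N + 1, d):
            dcnt[m] += 1
    return mu, dcnt, lam

N = 2000
mu, dcnt, lam = tables(N)
worst = 0.0
for n in range(1, N + 1):
    conv = sum(mu[d] * (math.log(n // d) - dcnt[n // d])
               for d in range(1, n + 1) if n % d == 0)
    worst = max(worst, abs(conv - (lam[n] - 1.0)))
print("max |(mu*(log-d))(n) - (Lambda(n)-1)| for n <=", N, ":", worst)
```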

Note: $(*)$ can be proved fairly easily using standard summation techniques: the first sum is $\sum_{n\le x}\log n=\log[x]!=x\log x-x+O(\log x)$, which is just partial summation (or comparison with $\int_1^x\log t\,dt$), and the second is $\sum_{n\le x}d(n)=x\log x+(2\gamma-1)x+O(\sqrt{x})$ by Dirichlet's hyperbola method.
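And a quick numerical check of $(*)$ itself (Python sketch, illustration only): the error term $\sum_{n\le x}(\log n-d(n))+2\gamma x$, divided by $\sqrt x$, should stay bounded.

```python
# Numerical check of (*): sum_{n<=x} (log n - d(n)) = -2*gamma*x + O(sqrt(x)).
# The error term divided by sqrt(x) should stay bounded.
import math

GAMMA = 0.5772156649015329

N = 10**5
dcnt = [0] * (N + 1)
for d in range(1, N + 1):            # divisor-count table, d(n)
    for m in range(d, N + 1, d):
        dcnt[m] += 1

acc = 0.0
for n in range(1, N + 1):
    acc += math.log(n) - dcnt[n]
    if n in (10**3, 10**4, 10**5):
        err = acc + 2 * GAMMA * n
        print(f"x = {n:>7}:  error / sqrt(x) = {err / math.sqrt(n):+.4f}")
```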