Asymptotic formula for $\sum_{n\leq x}\mu(n)[x/n]^2$ and the Totient summatory function $\sum_{n\leq x} \phi(n)$
I would like to show (for $x \ge 2$) that $$\sum_{n \le x}\mu(n)\left[\frac{x}{n}\right]^2 = \frac{x^2}{\zeta(2)} + O(x \log(x)).$$
I already have the identity $$\sum_{n \le x}\mu(n)\left[\frac{x}{n}\right] = 1$$ but the method to show this does not extend to my case.
I don't know how to approach this problem so any advice would be very welcome!
Since $$\left[\frac{x}{n}\right]^2 = \left(\frac{x}{n}\right)^2 + O\!\left(\frac{x}{n}\right)$$ we have $$\sum_{n \le x}\mu(n)\left[\frac{x}{n}\right]^2 = x^2\sum_{n \le x} \frac{\mu(n)}{n^2} + O\!\left(\sum_{n \le x} \frac{x}{n}\right).$$ The coefficient of $x^2$ tends to $\frac{1}{\zeta(2)}$, so it remains to show that the error term is $O(x \log x)$.
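As a quick numerical sanity check (not a proof), one can compare the sum against $x^2/\zeta(2)$ for a moderate $x$. The sieve below is a standard Möbius sieve written for this check; the names are my own, not from the question.

```python
import math

def mobius_sieve(limit):
    """Compute mu(0..limit): flip sign at each prime factor, zero out squares."""
    mu = [1] * (limit + 1)
    mu[0] = 0
    is_prime = [True] * (limit + 1)
    for p in range(2, limit + 1):
        if is_prime[p]:
            for k in range(p, limit + 1, p):
                if k > p:
                    is_prime[k] = False
                mu[k] = -mu[k]
            for k in range(p * p, limit + 1, p * p):
                mu[k] = 0  # divisible by a square
    return mu

x = 10_000
mu = mobius_sieve(x)
total = sum(mu[n] * (x // n) ** 2 for n in range(1, x + 1))
main = 6 * x * x / math.pi ** 2       # x^2 / zeta(2), since zeta(2) = pi^2/6
error = total - main
print(total, round(main), round(error))
```

The observed error is far smaller than the $x\log x$ bound claimed above.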
Solution 1:
Since sos440 provided the answer in the comments, a natural question is how much better can we do?
Edit: Recently, I actually had to do the contour integration and use Perron's formula, as I previously suggested. It turns out that improving the error term is quite a difficult problem, and you cannot do much better than $x\log x$.
Relation to the Sum of Euler's Totient Function:
We can show that $$\sum_{n\leq x}\mu(n)[x/n]^2=-1+2\sum_{n\leq x}\phi(n)$$ so that analyzing your sum is equivalent to analyzing the totient summatory function.
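This identity is easy to confirm by brute force for small $x$; the trial-division helpers below are naive implementations written just for this check.

```python
def mobius(n):
    """Naive Moebius function via trial division."""
    if n == 1:
        return 1
    result = 1
    p = 2
    while p * p <= n:
        if n % p == 0:
            n //= p
            if n % p == 0:
                return 0  # square factor => mu(n) = 0
            result = -result
        p += 1
    if n > 1:
        result = -result  # one remaining large prime factor
    return result

def totient(n):
    """Naive Euler phi via trial division."""
    result = n
    m = n
    p = 2
    while p * p <= m:
        if m % p == 0:
            while m % p == 0:
                m //= p
            result -= result // p
        p += 1
    if m > 1:
        result -= result // m
    return result

for x in range(1, 200):
    lhs = sum(mobius(n) * (x // n) ** 2 for n in range(1, x + 1))
    rhs = -1 + 2 * sum(totient(n) for n in range(1, x + 1))
    assert lhs == rhs, x
print("identity verified for x < 200")
```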
The History Of The Error Term For The Totient Sum
In 1874, Mertens proved the result you wanted above, namely that $$\sum_{n\leq x}\phi(n)=\frac{3}{\pi^{2}}x^{2}+O\left(x\log x\right).$$ Throughout we use the notation $\Phi(x)=\sum_{n\leq x}\phi(n)$ for the summatory function, and $E(x)=\Phi(x)-\frac{3}{\pi^{2}}x^{2}$ for the error function.
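To get a feel for the size of $E(x)$, one can tabulate it numerically. The sieve below is the standard totient sieve; the only bound tested is Mertens' $O(x\log x)$, not the sharper results quoted next.

```python
import math

def totient_sieve(limit):
    """phi(0..limit) via the sieve update phi[k] -= phi[k] // p at each prime p."""
    phi = list(range(limit + 1))
    for p in range(2, limit + 1):
        if phi[p] == p:  # p untouched so far => p is prime
            for k in range(p, limit + 1, p):
                phi[k] -= phi[k] // p
    return phi

limit = 100_000
phi = totient_sieve(limit)
cum = [0] * (limit + 1)      # cum[x] = Phi(x) = sum_{n <= x} phi(n)
for n in range(1, limit + 1):
    cum[n] = cum[n - 1] + phi[n]

for x in (100, 1_000, 10_000, 100_000):
    E = cum[x] - 3 * x * x / math.pi ** 2
    print(x, round(E, 1))
```

In practice $E(x)$ stays of order $x$, consistent with the average result below.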
The best unconditional result is due to Walfisz (1963): $$E(x)\ll x\left(\log x\right)^{\frac{2}{3}}\left(\log\log x\right)^{\frac{4}{3}}.$$
In 1930, Chowla and Pillai showed that this cannot be improved much further: $E(x)$ is not $$o\left(x\log\log\log x\right).$$
In particular, they showed that $\sum_{n\leq x}E(n)\sim\frac{3}{2\pi^{2}}x^{2}$, so that $E(n)\asymp n$ on average. In 1950, Erdős and Shapiro proved that there exists a constant $c>0$ such that for infinitely many positive integers $N,M$ we have $$E(N)>cN\log\log\log\log N\ \ \text{and}\ \ E(M)<-cM\log\log\log\log M, $$
or more concisely
$$E(x)=\Omega_{\pm}\left(x\log\log\log\log x\right).$$
In 1987 Montgomery improved this to
$$E(x)=\Omega_{\pm}\left(x\sqrt{\log\log x}\right).$$
Sum of Relatively Prime Pairs in the Grid: Lastly, it is interesting to notice that $$\sum_{d\leq x}\mu(d)\left[\frac{x}{d}\right]^{2}=\sum_{d\leq x}\mu(d)\left(\sum_{s\leq\frac{x}{d}}1\right)\left(\sum_{r\leq\frac{x}{d}}1\right)$$
$$=\sum_{d\leq x}\sum_{r,s\leq\frac{x}{d}}\mu(d)=\sum_{m,n\leq x}\sum_{d\mid\gcd(m,n)}\mu(d)=\sum_{\substack{m,n\leq x\\ \gcd(m,n)=1}}1,$$ which is the number of pairs of relatively prime integers, both of which are at most $x$.
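The coprime-pair interpretation is easy to confirm numerically by counting pairs directly; `mobius` below is a naive trial-division helper written for this check.

```python
from math import gcd

def mobius(n):
    """Naive Moebius function via trial division."""
    if n == 1:
        return 1
    result = 1
    p = 2
    while p * p <= n:
        if n % p == 0:
            n //= p
            if n % p == 0:
                return 0
            result = -result
        p += 1
    if n > 1:
        result = -result
    return result

x = 100
# direct count of coprime pairs (m, n) with 1 <= m, n <= x
pairs = sum(1 for m in range(1, x + 1) for n in range(1, x + 1)
            if gcd(m, n) == 1)
# the Moebius-sum side of the identity
mob_sum = sum(mobius(d) * (x // d) ** 2 for d in range(1, x + 1))
print(pairs, mob_sum)
```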
Hope that helps,
Added: At some point, I wrote a long blog post about this, complete with a proof of Montgomery's lower bound.
Solution 2:
So you have \begin{align*} \sum\limits_{n \leq x} \mu(n)\biggl[\frac{x}{n}\biggr]^{2} &= \sum\limits_{n \leq x} \mu(n)\biggl(\frac{x}{n}-\biggl\{\frac{x}{n}\biggr\}\biggr)^{2} \\ &= x^{2}\sum\limits_{n \leq x }\frac{\mu(n)}{n^{2}} - 2x\sum\limits_{n \leq x}\frac{\mu(n)}{n}\biggl\{\frac{x}{n}\biggr\} + \sum\limits_{n \leq x}\mu(n)\biggl\{\frac{x}{n}\biggr\}^{2} \\ &= x^{2}\sum\limits_{n \leq x }\frac{\mu(n)}{n^{2}} + \mathcal{O}(x\log{x}) + \mathcal{O}(x), \end{align*} since $0 \leq \{x/n\} < 1$ and $\sum_{n \leq x} \frac{1}{n} = \mathcal{O}(\log x)$.
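The triviality of the middle term's bound can be seen numerically: term by term it is dominated by the harmonic sum. The `mobius` helper below is a naive one written just for this check.

```python
import math

def mobius(n):
    """Naive Moebius function via trial division."""
    if n == 1:
        return 1
    result = 1
    p = 2
    while p * p <= n:
        if n % p == 0:
            n //= p
            if n % p == 0:
                return 0
            result = -result
        p += 1
    if n > 1:
        result = -result
    return result

x = 5000
# sum_{n <= x} mu(n) * {x/n} / n, with {t} the fractional part
middle = sum(mobius(n) * ((x / n) - (x // n)) / n for n in range(1, x + 1))
harmonic = sum(1.0 / n for n in range(1, x + 1))
print(middle, harmonic)
```

Since each summand has absolute value below $1/n$, `abs(middle)` never exceeds `harmonic`, which is $\log x + O(1)$.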
To find the first sum note that: \begin{align*}x^{2} \sum\limits_{n \leq x } \frac{\mu(n)}{n^{2}} &= x^{2}\sum\limits_{n=1}^{\infty} \frac{\mu(n)}{n^{2}} - x^{2}\sum\limits_{n > x} \frac{\mu(n)}{n^{2}} \\ &= \frac{x^{2}}{\zeta(2)} + \mathcal{O}\biggl(x^{2}\sum\limits_{n >x}\frac{1}{n^{2}}\biggr) \\ &= \frac{x^{2}}{\zeta(2)} + \mathcal{O}\Biggl(x^{2}\Biggl[\sum\limits_{n=1}^{\infty} \frac{1}{n^{2}}-\sum\limits_{n \leq x } \frac{1}{n^{2}}\Biggr]\Biggr) \\ &= \frac{x^{2}}{\zeta(2)} + \mathcal{O}\Biggl(x^{2}\Biggl[\zeta(2)-\sum\limits_{n \leq x } \frac{1}{n^{2}}\Biggr]\Biggr) \end{align*}
and finally note that for $s > 0$, $s \neq 1$, $$\sum\limits_{n \leq x} \frac{1}{n^s} = \frac{x^{1-s}}{1-s} + \zeta(s) + \mathcal{O}(x^{-s}),$$ so taking $s = 2$ gives $\zeta(2) - \sum_{n \leq x} \frac{1}{n^{2}} = \frac{1}{x} + \mathcal{O}(x^{-2})$, and the error term above is $\mathcal{O}(x)$.
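The $s=2$ case of this partial-sum formula can be checked directly; $\zeta(2)=\pi^2/6$ is standard, and the tolerance constant below is an assumption chosen generously.

```python
import math

s = 2
x = 1000
# left side: the partial sum of 1/n^2 up to x
partial = sum(1.0 / n ** s for n in range(1, x + 1))
# right side: x^{1-s}/(1-s) + zeta(2), i.e. -1/x + pi^2/6 for s = 2
approx = x ** (1 - s) / (1 - s) + math.pi ** 2 / 6
print(partial, approx, partial - approx)
```

The discrepancy is of order $x^{-2}$, matching the stated error term.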