If $(A-B)^2=AB$, prove that $\det(AB-BA)=0$.

Let $A,B\in M_{n}(\mathbb{Q})$. If $(A-B)^2=AB$, prove that $\det(AB-BA)=0$.

I considered the function $f:\mathbb{Q}\rightarrow \mathbb{Q}$, $f(x)=\det(A^2+B^2-BA-xAB)$ and I obtained that: $$f(0)=\det(A^2+B^2-BA)=\det(2AB)=2^n\det(AB)$$ $$f(1)=\det(A^2+B^2-BA-AB)=\det((A-B)^2)=\det(AB)$$ $$f(2)=\det(A^2+B^2-BA-2AB)=\det((A-B)^2-AB)=\det(AB-AB)=0$$
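These three values can be sanity-checked mechanically. The sketch below uses sympy and one hand-picked pair satisfying $(A-B)^2=AB$ (the pair has irrational entries and is my own choice for illustration; it is not part of the attempt above), and confirms $f(0)=2^n\det(AB)$, $f(1)=\det(AB)$, $f(2)=0$:

```python
import sympy as sp

s5 = sp.sqrt(5)
# One hand-picked pair with (A - B)**2 == A*B (entries are irrational;
# the identities for f(0), f(1), f(2) hold regardless of the field).
A = sp.Matrix([[1, 1], [0, (-3 + s5) / 2]])
B = sp.Matrix([[(3 + s5) / 2, 1], [0, -1]])
assert ((A - B) ** 2 - A * B).expand() == sp.zeros(2)

n = A.shape[0]

def f(x):
    # f(x) = det(A^2 + B^2 - BA - x*AB)
    return sp.simplify((A ** 2 + B ** 2 - B * A - x * A * B).det())

assert sp.simplify(f(0) - 2 ** n * (A * B).det()) == 0  # f(0) = 2^n det(AB)
assert sp.simplify(f(1) - (A * B).det()) == 0           # f(1) = det(AB)
assert sp.simplify(f(2)) == 0                           # f(2) = 0
```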

I don't have any other ideas.


The idea is to use $(-3\pm\sqrt 5)/2$. We begin with

$$\begin{align} (A+xB)(A+x'B) &= A^2 + x BA + x' AB + xx' B^2 \\ &= A^2 + (x' + x) AB+x(BA-AB)+xx'B^2.\end{align} $$

Now we find numbers $x$, $x'$ such that $x'+x = -3$, $xx'=1$. These numbers are the roots of the quadratic equation $$ \lambda^2 +3\lambda +1 = 0. $$ Thus, we have $x = \frac{-3+\sqrt 5}2$ and $x'= \frac{-3-\sqrt 5}2$.

With these, we have $$\begin{align} (A+xB)(A+x'B) &= A^2 -3AB + B^2 + x(BA-AB) \\ &= BA-AB + x(BA-AB) = (1+x)(BA-AB),\end{align} $$ where the second equality uses the hypothesis: $(A-B)^2=AB$ gives $A^2+B^2=2AB+BA$, hence $A^2-3AB+B^2=BA-AB$. Now set $$ q = \det\bigl((A+xB)(A+x'B)\bigr) = \det(A+xB)\det(A+x'B). $$ Since $p(t)=\det(A+tB)$ is a polynomial in $t$ with rational coefficients, the product $p(x)p(x')$ is symmetric in $x,x'$ and hence a polynomial with rational coefficients in $x+x'=-3$ and $xx'=1$; thus $q\in\mathbb{Q}$. On the other hand, $$ q=(1+x)^n \det(BA-AB), $$ where $\det(BA-AB)\in\mathbb{Q}$ because $A,B\in M_n(\mathbb{Q})$. Finally, $(1+x)^n\in\mathbb{R}\setminus\mathbb{Q}$ for every $n\ge1$: if it were rational it would equal its conjugate $(1+x')^n$ under $\sqrt5\mapsto-\sqrt5$, which is impossible since $|1+x|=\frac{\sqrt5-1}2\neq\frac{\sqrt5+1}2=|1+x'|$. A rational number can equal an irrational multiple of a rational number only if the rational factor vanishes, so $\det(BA-AB)=0$.
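The algebra above can be double-checked symbolically: as a formal identity in noncommuting $A,B$ (no hypothesis used), $(A+xB)(A+x'B)-(1+x)(BA-AB)=(A-B)^2-AB$, so the left side equals $(1+x)(BA-AB)$ exactly when $(A-B)^2=AB$. A sketch in sympy using its noncommutative symbols; this is only a mechanical check, not part of the proof:

```python
import sympy as sp

s5 = sp.sqrt(5)
x = (-3 + s5) / 2   # x  = (-3+sqrt(5))/2
xp = (-3 - s5) / 2  # x' = (-3-sqrt(5))/2
A, B = sp.symbols('A B', commutative=False)

# Formal identity: (A+xB)(A+x'B) - (1+x)(BA-AB) = (A-B)^2 - AB,
# and the right side vanishes under the hypothesis (A-B)^2 = AB.
lhs = sp.expand((A + x * B) * (A + xp * B) - (1 + x) * (B * A - A * B))
rhs = sp.expand((A - B) ** 2 - A * B)
assert sp.expand(lhs - rhs) == 0
assert sp.expand(x * xp) == 1 and sp.expand(x + xp) == -3
```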


An example with $AB\neq BA$

For the example that @Jonas Meyer requested, here it is: $$ A=\begin{pmatrix} 1 & 1 \\ 0 & \frac{-3+\sqrt 5}2\end{pmatrix}, \ \ B=\begin{pmatrix} \frac{3+\sqrt 5}2 & 1 \\ 0 & -1\end{pmatrix}. $$ Then $$ A-B= \begin{pmatrix} \frac{-1-\sqrt 5}2 & 0 \\ 0 & \frac{-1+\sqrt 5}2\end{pmatrix},$$ $$ (A-B)^2 = \begin{pmatrix} \frac{3+\sqrt 5}2 & 0 \\ 0 & \frac{3-\sqrt 5}2\end{pmatrix} = AB. $$ But, $$ BA =\begin{pmatrix} \frac{3+\sqrt 5}2 & \sqrt 5 \\ 0 & \frac{3-\sqrt 5}2\end{pmatrix} \neq AB. $$
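The example is easy to verify mechanically. A short sympy check (it just re-derives the displayed matrices) also confirms $\det(AB-BA)=0$, as the statement predicts:

```python
import sympy as sp

s5 = sp.sqrt(5)
A = sp.Matrix([[1, 1], [0, (-3 + s5) / 2]])
B = sp.Matrix([[(3 + s5) / 2, 1], [0, -1]])

AB, BA = A * B, B * A
assert ((A - B) ** 2 - AB).expand() == sp.zeros(2)  # (A-B)^2 = AB
assert (AB - BA).expand() != sp.zeros(2)            # AB != BA
assert sp.simplify((AB - BA).det()) == 0            # det(AB - BA) = 0
```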


The statement is true without the assumption that $A,B$ have rational entries. As in i707107's answer, let $$ x=\frac{-3-\sqrt{5}}2,\,x'=\frac{-3+\sqrt{5}}2 $$ so that $x+x'=-3$, $xx'=1$. Let $X=A+xB$, $Y=A+x'B$. Then $$\begin{eqnarray*} (1+x')XY-(1+x)YX &=&(x'-x)(A^2+xx'B^2)+(x'(1+x')-x(1+x))AB\\ &&{}+(x(1+x')-x'(1+x))BA\\ &=&(x'-x)\left[A^2+B^2+(1+x'+x)AB-BA\right]\\ &=&(x'-x)\left[(A-B)^2-AB\right]\\ &=&0. \end{eqnarray*}$$ Thus $XY=kYX$, where $$ k=\frac{1+x}{1+x'}=-\frac{3+\sqrt 5}2. $$ Note that $|k|>1$. Pick an eigenvector $v$ for $X$ whose eigenvalue $\lambda$ has maximal magnitude among the eigenvalues of $X$. Then $$ X(Yv)=kYXv=(k\lambda)(Yv). $$ If $\lambda=0$ then $YXv=XYv=0$. If $\lambda\neq0$ then $|k\lambda|>|\lambda|$, so by the maximality of $|\lambda|$, $k\lambda$ can't be an eigenvalue of $X$. This implies $Yv=0$, so again $XYv=0$ and $YXv=\lambda Yv=0$. In either case we have $(XY-YX)v=0$, so $XY-YX$ is singular. Finally, since $$ XY-YX=(x'-x)(AB-BA), $$ we conclude $AB-BA$ is singular.
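The displayed algebra here can also be checked mechanically with sympy's noncommutative symbols (note that the roles of $x$ and $x'$ are swapped relative to the previous answer); again this is only a sanity check of the bookkeeping, not part of the argument:

```python
import sympy as sp

s5 = sp.sqrt(5)
x = (-3 - s5) / 2   # x  = (-3-sqrt(5))/2, as in this answer
xp = (-3 + s5) / 2  # x' = (-3+sqrt(5))/2
A, B = sp.symbols('A B', commutative=False)
X = A + x * B
Y = A + xp * B

# (1+x')XY - (1+x)YX = (x'-x)[(A-B)^2 - AB], which is 0 under the hypothesis
lhs = sp.expand((1 + xp) * X * Y - (1 + x) * Y * X)
rhs = sp.expand((xp - x) * ((A - B) ** 2 - A * B))
assert sp.expand(lhs - rhs) == 0

# XY - YX = (x'-x)(AB - BA)
assert sp.expand(X * Y - Y * X - (xp - x) * (A * B - B * A)) == 0

# k = (1+x)/(1+x') has |k| > 1
k = (1 + x) / (1 + xp)
assert abs(sp.N(k)) > 1
```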


To give a bit more insight, suppose we started with an arbitrary homogeneous degree 2 constraint on $A,B$: $$ c_2A^2+c_1AB+c_1'BA+c_0B^2=0. $$ If we replace $A,B$ by commuting variables $a,b$, the corresponding polynomial would factor over $\mathbb C$: $$ c_2a^2+(c_1+c_1')ab+c_0b^2=(\alpha a+\beta b)(\gamma a+\delta b). $$ Let $X=\alpha A+\beta B$ and $Y=\gamma A+\delta B$. If $A$ and $B$ commuted we'd have $XY=0$, but instead we get $$ XY=(\alpha\delta-c_1)[A,B] $$ where $[A,B]=AB-BA$ is the Lie bracket. Note that $[X,Y]=(\alpha\delta-\beta\gamma)[A,B]$, so $$ (\alpha\delta-\beta\gamma)XY=(\alpha\delta-c_1)[X,Y]. $$ Unless a coefficient happens to vanish, this gives $XY=kYX$ for some $k$. When $k$ is not a root of unity this is quite a restrictive constraint (e.g. $XY$ must be singular: taking determinants in $XY=kYX$ gives $\det(XY)=k^n\det(YX)=k^n\det(XY)$, which forces $\det(XY)=0$ when $k^n\neq1$).
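The coefficient bookkeeping behind $XY=(\alpha\delta-c_1)[A,B]$ can be verified once and for all with symbolic coefficients. A sympy sketch, where the matching conditions $\alpha\gamma=c_2$, $\beta\delta=c_0$, $\alpha\delta+\beta\gamma=c_1+c_1'$ come from comparing coefficients in the commutative factorization:

```python
import sympy as sp

al, be, ga, de, c1 = sp.symbols('alpha beta gamma delta c1')
A, B = sp.symbols('A B', commutative=False)

# Coefficient matching from (alpha*a + beta*b)(gamma*a + delta*b):
c2 = al * ga
c0 = be * de
c1p = al * de + be * ga - c1  # since c1 + c1' = alpha*delta + beta*gamma

X = al * A + be * B
Y = ga * A + de * B

# XY - (alpha*delta - c1)[A,B] equals c2 A^2 + c1 AB + c1' BA + c0 B^2,
# so XY = (alpha*delta - c1)[A,B] whenever that constraint is 0.
constraint = c2 * A ** 2 + c1 * A * B + c1p * B * A + c0 * B ** 2
lhs = sp.expand(X * Y - (al * de - c1) * (A * B - B * A))
assert sp.expand(lhs - constraint) == 0
```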