Contradicting Fubini's theorem

I have a function defined as follows: $f(x,y)= \dfrac{x^2-y^2}{\left(x^2+y^2\right)^2}$, if $(x,y)\neq (0, 0)$ and $f(x,y)=0$ if $(x,y)=(0,0)$. Now, $$\int_0^1\int_0^1 f(x,y)\,\text{d}x~\text{d}y=-\frac{\pi}{4}$$ and $$\int_0^1\int_0^1 f(x,y)\,\text{d}y~\text{d}x=\frac{\pi}{4}.$$ The question I have is this: why does this not contradict Fubini's theorem?

Thanks.


Because the integrals of the positive and negative parts are both infinite.

If $x>y$ then $(x^2-y^2)/(x^2+y^2)^2$ is positive; if $x<y$, it is negative. The diagonal line $x=y$ divides the unit square into two triangles. Integrating the function over the triangle where it is positive yields $+\infty$; integrating over the other triangle yields $-\infty$.
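This divergence can be seen numerically (a sketch in plain Python, nothing assumed beyond the standard library): midpoint-rule sums of $f$ over the triangle $0<y<x<1$ keep growing as the grid is refined, roughly like $\tfrac12\ln n$, instead of settling toward a limit.

```python
def f(x, y):
    return (x*x - y*y) / (x*x + y*y)**2

def positive_part_sum(n):
    # Midpoint-rule sum of f over the triangle 0 < y < x < 1
    # (the region where f is positive), on an n-by-n grid.
    h = 1.0 / n
    total = 0.0
    for i in range(n):
        x = (i + 0.5) * h
        for j in range(n):
            y = (j + 0.5) * h
            if y < x:
                total += f(x, y) * h * h
    return total

# The sums keep growing instead of converging:
for n in (50, 100, 200, 400):
    print(n, positive_part_sum(n))
```

Each doubling of the resolution adds roughly $\tfrac12\ln 2\approx 0.35$ to the sum, which is the numerical signature of a logarithmically divergent integral.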

It's the same as with infinite series: if the sum of the positive terms is $+\infty$ and the sum of the negative terms is $-\infty$, and the series adds up to some finite number, then you can make it add up to a different finite number by rearranging the terms (this is Riemann's rearrangement theorem).
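A concrete instance of this, sketched in Python with the alternating harmonic series (its positive terms sum to $+\infty$ and its negative terms to $-\infty$): in the usual order it sums to $\ln 2$, but taking one positive term followed by two negative terms makes the very same terms sum to $\tfrac12\ln 2$.

```python
import math

def alternating_harmonic(n_terms):
    # Partial sum of 1 - 1/2 + 1/3 - 1/4 + ...  (converges to ln 2)
    return sum((-1)**(k + 1) / k for k in range(1, n_terms + 1))

def rearranged(n_blocks):
    # Same terms in a different order: one positive, then two negatives,
    # i.e. 1 - 1/2 - 1/4 + 1/3 - 1/6 - 1/8 + ...
    # This rearrangement converges to (1/2) ln 2 instead.
    total = 0.0
    for k in range(1, n_blocks + 1):
        total += 1.0 / (2*k - 1) - 1.0 / (4*k - 2) - 1.0 / (4*k)
    return total

print(alternating_harmonic(10**6))  # close to ln 2  ~ 0.6931
print(rearranged(10**6))            # close to ln(2)/2 ~ 0.3466
```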

If a function is nonnegative and its integral is $\infty$, then rearranging it won't change the fact that its integral is $\infty$. Therefore, if one looks at the absolute value $$ \int_0^1\int_0^1 \left|\frac{x^2-y^2}{(x^2+y^2)^2}\right| \;dy\;dx, $$ it will remain $+\infty$ no matter which order of integration you use. And that is precisely the circumstance in which Fubini's theorem does not apply: the theorem requires the integral of $|f|$ to be finite.
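For completeness, the $\pm\pi/4$ values in the question are easy to check numerically (a Python sketch, using the closed-form inner antiderivative $y/(x^2+y^2)$):

```python
import math

# For fixed x > 0,  d/dy [ y/(x^2 + y^2) ] = (x^2 - y^2)/(x^2 + y^2)^2,
# so the inner integral over y in [0, 1] is exactly 1/(1 + x^2).
def inner_dy(x):
    return 1.0 / (1.0 + x * x)

def iterated_dy_then_dx(n):
    # Midpoint rule in x applied to the exact inner integral.
    h = 1.0 / n
    return sum(inner_dy((i + 0.5) * h) for i in range(n)) * h

print(iterated_dy_then_dx(10**5))  # close to pi/4 ~ 0.7853981
# Since f(x, y) = -f(y, x), reversing the order of integration gives -pi/4.
```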