Why doesn't multiplying square roots of imaginary numbers follow $\sqrt{a} \times \sqrt{b} = \sqrt{ab}$?
Solution 1:
$\sqrt\cdot$ is well defined on $\Bbb R_+$ because $\Bbb R$ has a total order, which lets us make a sensible choice between the two square roots of $b$ (i.e. $\sqrt b$ and $-\sqrt b$): take the nonnegative one. However, $\Bbb C$ admits no order compatible with its field structure, so you can no longer compare numbers ($3i \le 5$ makes no sense, for instance).
Therefore, in your example, $i\sqrt 2 \times i\sqrt 3 = -\sqrt 6$ is indeed *a* square root of $6$, but it is not *the* square root of $6$ as we like to call it in $\Bbb R$. In $\Bbb R$ this issue does not arise: the product of two positive numbers is again positive, so $\sqrt a \, \sqrt b$ is always the positive square root of $ab$.
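A quick numerical check makes the distinction concrete. Python's `cmath.sqrt` returns the principal (conventionally chosen) square root, so we can watch the sign flip:

```python
import cmath

# sqrt(-2) * sqrt(-3): each factor is the principal square root,
# i.e. i*sqrt(2) and i*sqrt(3), so the product is -sqrt(6).
lhs = cmath.sqrt(-2) * cmath.sqrt(-3)
rhs = cmath.sqrt(6)   # the principal square root of 6

print(lhs)  # approximately -2.449, i.e. -sqrt(6)
print(rhs)  # approximately +2.449, i.e. +sqrt(6)

# Both are square roots of 6 -- they just aren't the same one.
print(lhs**2, rhs**2)  # both approximately 6
```

So the computation is consistent: $-\sqrt 6$ squares to $6$; it simply is not the root that the convention on $\Bbb R_+$ singles out.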
Solution 2:
It all comes from how we define the square root of a complex number. In complex analysis, every nonzero number can be written as $re^{i\theta} = re^{i(\theta +2\pi)}$ with $r > 0$. This comes from Euler's formula and writing the number in polar coordinates.
As a result, since $e^{i\theta}$ is periodic with period $2\pi$, when we define the logarithm of a complex number we must either accept a discontinuity or define it only on the plane minus a ray from the origin (a branch cut). We usually choose the second option, since it makes our lives much easier.
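The discontinuity we are avoiding is easy to see numerically. Python's `cmath.log` uses the principal branch, with the cut along the negative real axis; approaching $-1$ from just above versus just below that axis gives imaginary parts that jump by $2\pi$:

```python
import cmath

# Approach -1 from just above and just below the negative real axis.
above = cmath.log(complex(-1.0,  1e-12))  # argument near +pi
below = cmath.log(complex(-1.0, -1e-12))  # argument near -pi

print(above.imag)  # approximately +3.14159
print(below.imag)  # approximately -3.14159
```

Removing the ray (here, the negative real axis) is exactly what lets the logarithm be continuous on what remains.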
Now, to define square roots in the complex plane, we set $\sqrt{z}=e^{\frac{\log(z)}{2}}$. Since $e^{i\theta}$ has period $2\pi$, the identity $\sqrt{zw}=\sqrt{z}\sqrt{w}$ holds only up to the arguments being taken modulo $2\pi$. Moreover, for the identity to hold exactly, the ray removed from the origin must be the same for all the terms involved: for positive real numbers one usually cuts along the negative real axis, while for negative real numbers one usually cuts along the positive real axis. So another part of the problem is that you are implicitly switching between what are called branches of the logarithm.
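The definition above can be sketched directly (`principal_sqrt` is just an illustrative name; it implements $e^{\log(z)/2}$ with Python's principal branch of `cmath.log`), and it reproduces the failure in the question: with $z = w = -1$, the arguments sum to $2\pi$, which wraps back to $0$:

```python
import cmath

def principal_sqrt(z):
    """Square root via exp(log(z)/2), using the principal branch of log."""
    return cmath.exp(cmath.log(z) / 2)

z = w = -1 + 0j
lhs = principal_sqrt(z * w)                  # sqrt(1)  =  1
rhs = principal_sqrt(z) * principal_sqrt(w)  # i * i    = -1

# arg(z) + arg(w) = pi + pi = 2*pi, but arg(z*w) = 0: the angle has
# wrapped modulo 2*pi, so the identity picks up a factor of -1.
print(lhs, rhs)
```

Whenever $\arg(z) + \arg(w)$ stays inside the principal range $(-\pi, \pi]$, no wrap occurs and $\sqrt{zw}=\sqrt{z}\sqrt{w}$ does hold.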