Given that $X,Y$ are independent $N(0,1)$, show that $\dfrac{XY}{\sqrt{X^2+Y^2}}$ and $\dfrac{X^2-Y^2}{2\sqrt{X^2+Y^2}}$ are independent $N(0,\frac{1}{4})$ random variables.
If you transform $(X,Y)\mapsto(R,\Theta)$ where $X=R\cos\Theta,Y=R\sin\Theta$,
you should end up with the joint density of $(R,\Theta)$ as $f_{R,\Theta}(r,\theta)=\dfrac{r}{2\pi}e^{-r^2/2}\mathbf1_{\{r>0,\,0<\theta<2\pi\}}$.
This implies $R$ and $\Theta$ are independent, where $R$ has the Rayleigh distribution and $\Theta\sim\mathcal{U}(0,2\pi)$.
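As a sketch of where that density comes from: the Jacobian of $(r,\theta)\mapsto(x,y)=(r\cos\theta,r\sin\theta)$ equals $r$, and $f_{X,Y}(x,y)=\frac{1}{2\pi}e^{-(x^2+y^2)/2}$ depends on $(x,y)$ only through $x^2+y^2=r^2$, so
$$f_{R,\Theta}(r,\theta)=f_{X,Y}(r\cos\theta,r\sin\theta)\cdot r=\frac{r}{2\pi}e^{-r^2/2},\qquad r>0,\ 0<\theta<2\pi.$$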
Now changing variables $(R,\Theta)\mapsto(U,V)$ such that $U=R\sin(2\Theta),V=R\cos(2\Theta)$,
you should be able to show that $U$ and $V$ are independent $\mathcal{N}(0,1)$ variables.
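One way to see this (a sketch): since $\Theta\sim\mathcal{U}(0,2\pi)$ is independent of $R$, the angle $2\Theta\bmod 2\pi$ is again $\mathcal{U}(0,2\pi)$ and independent of $R$, and $\sin$, $\cos$ are $2\pi$-periodic, so
$$(U,V)=\big(R\sin(2\Theta),\,R\cos(2\Theta)\big)\stackrel{d}{=}\big(R\sin\Theta,\,R\cos\Theta\big)=(Y,X),$$
i.e. $(U,V)$ has the same distribution as a pair of independent standard normals.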
Note that $U=\dfrac{2XY}{\sqrt{X^2+Y^2}}$ and $V=\dfrac{X^2-Y^2}{\sqrt{X^2+Y^2}}$ are independent, which in turn means that
$\dfrac{U}{2}=\dfrac{XY}{\sqrt{X^2+Y^2}}$ and $\dfrac{V}{2}=\dfrac{X^2-Y^2}{2\sqrt{X^2+Y^2}}$ are independent $\mathcal{N}(0,1/4)$ variables.
The following argument is independent of the one above:
The joint density of $(X,Y)$ is $\displaystyle f_{X,Y}(x,y)=\frac{1}{2\pi}e^{-\frac{1}{2}(x^2+y^2)},\quad (x,y)\in\mathbb{R}^2$.
We transform $(X,Y)\mapsto(R,\Theta)\mapsto(U,V)$ where
$x=r\cos\theta,\ y=r\sin\theta$ and $u=\frac{r}{2}\sin(2\theta),\ v=\frac{r}{2}\cos(2\theta)$
$(x,y)\in\mathbb{R}^2\implies r>0,\ 0<\theta<2\pi\implies (u,v)\in\mathbb{R}^2$.
Note that this transformation is not one-to-one: $(x,y)$ and $(-x,-y)$ map to the same $(u,v)$.
The Jacobian of the transformation is $J\left(\frac{x,y}{u,v}\right) = J\left(\frac{x,y}{r,\theta}\right)J\left(\frac{r,\theta}{u,v}\right)=J_1J_2$, say.
Also, $x^2+y^2=r^2=4(u^2+v^2)$ and $|J_1||J_2|=r\times\frac{2}{r}=2$.
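For completeness, a sketch of the Jacobian values used here: $|J_1|=\left|\dfrac{\partial(x,y)}{\partial(r,\theta)}\right|=r$ is the usual polar Jacobian, and $J_2$ is easiest to obtain by differentiating $(u,v)$ with respect to $(r,\theta)$ and inverting:
$$\frac{\partial(u,v)}{\partial(r,\theta)}=\det\begin{pmatrix}\tfrac12\sin 2\theta & r\cos 2\theta\\ \tfrac12\cos 2\theta & -r\sin 2\theta\end{pmatrix}=-\frac{r}{2},\qquad\text{so}\quad |J_2|=\left|\frac{\partial(r,\theta)}{\partial(u,v)}\right|=\frac{2}{r}.$$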
Now $\left(U,V\right)=\left(\dfrac{XY}{\sqrt{X^2+Y^2}},\dfrac{X^2-Y^2}{2\sqrt{X^2+Y^2}}\right)$ has exactly two preimages, $(X,Y)$ and $(-X,-Y)$.
Moreover, $X,Y\stackrel{\text{i.i.d.}}{\sim}\mathcal{N}(0,1)\iff -X,-Y\stackrel{\text{i.i.d.}}{\sim}\mathcal{N}(0,1)$.
Hence the joint density of $(U,V)$ is given by $$f_{U,V}(u,v)=f_{X,Y}(g_1(u,v),h_1(u,v))|J_1||J_2| + f_{X,Y}(g_2(u,v),h_2(u,v))|J_1||J_2|$$
$$=\frac{1}{2\pi}e^{-\frac{1}{2} 4(u^2+v^2)}|J_1||J_2|\times 2$$
$$=\frac{1}{\sqrt{\frac{1}{4}}\sqrt{2\pi}}\exp\left(-\frac{u^2}{2\cdot\frac{1}{4}}\right)\cdot\frac{1}{\sqrt{\frac{1}{4}}\sqrt{2\pi}}\exp\left(-\frac{v^2}{2\cdot\frac{1}{4}}\right),\quad (u,v)\in\mathbb{R}^2$$
(The factor of $2$ in the second step appears because the two preimages of $(u,v)$, namely $(g_i(u,v),h_i(u,v))$ for $i=1,2$, contribute equally to the joint density.)
This implies $U$ and $V$ are independent $\mathcal{N}(0,1/4)$ variables.
Setting $U=\dfrac{XY}{\sqrt{X^2+Y^2}}, V=\dfrac{X^2-Y^2}{2\sqrt{X^2+Y^2}}$ and using moment generating functions:
\begin{align}
M_{(U,V)}(u,v) & =\mathbb{E}\left[\mathrm{e}^{\langle\,(u,v)\,;\,(U,V)\,\rangle}\right] \\[10pt]
& =\iint_{\mathbb{R}^2}\exp\left(u\frac{xy}{\sqrt{x^2+y^2}}+v\frac{x^2-y^2}{2\sqrt{x^2+y^2}}\right)\cdot f_{(X,Y)}(x,y)\,\mathrm{d}x\,\mathrm{d}y.
\end{align}
You can check (a sketch of the computation is given below) that $$M_{(U,V)}(u,v)=\exp\left(\frac18(u^2+v^2)\right)=\exp\left(\frac12(u,v)
\begin{pmatrix}1/4 & 0\\ 0 & 1/4\end{pmatrix}
(u,v)^T\right)$$
so that $(U,V)$ is multivariate normal with mean $\mathbf{0}$ and diagonal covariance matrix, meaning $U$ and $V$ are independent $\mathcal{N}(0,1/4)$ variables.
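For completeness, a sketch of how the integral can be evaluated: switch to polar coordinates $x=r\cos\theta$, $y=r\sin\theta$, so the exponent in the integrand becomes $\frac{r}{2}\left(u\sin 2\theta+v\cos 2\theta\right)$; since this is $2\pi$-periodic in $2\theta$, the substitution $\psi=2\theta$ leaves the angular integral unchanged, and
$$M_{(U,V)}(u,v)=\frac{1}{2\pi}\int_0^\infty\!\!\int_0^{2\pi}\exp\left(\frac{r}{2}\,(u\sin\psi+v\cos\psi)\right)e^{-r^2/2}\,r\,\mathrm{d}\psi\,\mathrm{d}r=\mathbb{E}\left[\mathrm{e}^{\frac{u}{2}Y+\frac{v}{2}X}\right]=\mathrm{e}^{(u^2+v^2)/8},$$
the last step being the moment generating function of two independent standard normals.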
If you have already proved that $\frac{X Y}{\sqrt{X^2 + Y^2}}$ and $\frac{X^2 - Y^2}{\sqrt{X^2 + Y^2}}$ are Gaussian, and moreover that the pair is jointly Gaussian, then you may use the property of jointly Gaussian variables: $U,\, V \text{ independent} \iff \operatorname{cov}(U,V) = 0$. (Dropping the factor $\tfrac12$ from the second variable is harmless, since rescaling changes neither independence nor whether the covariance vanishes.)
$$ \operatorname{cov} \left( \frac{X Y}{\sqrt{X^2 + Y^2}},\frac{X^2 - Y^2}{\sqrt{X^2 + Y^2}} \right)$$ $$ = \operatorname{cov} \left( \frac{X Y}{\sqrt{X^2 + Y^2}},\frac{X^2}{\sqrt{X^2 + Y^2}} \right) - \operatorname{cov} \left( \frac{X Y}{\sqrt{X^2 + Y^2}},\frac{Y^2}{\sqrt{X^2 + Y^2}} \right)$$
Now make use of the symmetry of these expressions in $X$ and $Y$: swapping $X$ and $Y$ leaves the joint distribution unchanged and interchanges the two covariance terms, so $$ \operatorname{cov}(\cdots) = 0. $$
To make this more rigorous, one may rewrite the covariances as follows: $$ \operatorname{cov} \left( \frac{X Y}{\sqrt{X^2 + Y^2}},\frac{X^2}{\sqrt{X^2 + Y^2}} \right) - \operatorname{cov} \left( \frac{X Y}{\sqrt{X^2 + Y^2}},\frac{Y^2}{\sqrt{X^2 + Y^2}} \right) $$ $$ = E \left( \frac{X^3 Y}{X^2 + Y^2} \right) - E \left( \frac{X Y^3}{X^2 + Y^2} \right) $$ (the product-of-means terms cancel, because $E \left( \frac{X^2}{\sqrt{X^2 + Y^2}} \right) = E \left( \frac{Y^2}{\sqrt{X^2 + Y^2}} \right)$ by the same symmetry). Now renaming $X \to Y,\, Y \to X$ under the first expectation sign (which is just a renaming of variables under the integral), we get the result.
The covariance method can be carried even further. Continuing from the last line, apply the substitution $X \to -X$ ($X$ has a symmetric distribution) to the second expectation: $$ = E \left( \frac{X^3 Y}{X^2 + Y^2} \right) - E \left( \frac{X Y^3}{X^2 + Y^2} \right) = E \left( \frac{X^3 Y}{X^2 + Y^2} \right) + E \left( \frac{X Y^3}{X^2 + Y^2} \right) $$ $$ = E \left( \frac{X Y(X^2 + Y^2)}{X^2 + Y^2} \right) = E(XY) = 0$$
The last equality holds because $X$ and $Y$ are independent: $0=\operatorname{cov}(X,Y)=E(XY) - E(X)E(Y)=E(XY).$
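As an optional numerical sanity check (a minimal sketch using NumPy, not part of any of the proofs above), one can simulate $(U,V)$ and confirm that the sample variances are close to $1/4$ and the sample correlation is close to $0$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10**6

# X, Y independent standard normals
x = rng.standard_normal(n)
y = rng.standard_normal(n)

r = np.sqrt(x**2 + y**2)
u = x * y / r                  # U = XY / sqrt(X^2 + Y^2)
v = (x**2 - y**2) / (2 * r)    # V = (X^2 - Y^2) / (2 sqrt(X^2 + Y^2))

print("var(U):", u.var())                      # expect about 0.25
print("var(V):", v.var())                      # expect about 0.25
print("corr(U, V):", np.corrcoef(u, v)[0, 1])  # expect about 0
```

Of course, a near-zero sample correlation only corroborates, and does not by itself prove, the independence established above.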