Jacobians when integrating over symmetric and antisymmetric matrices

I'm interested in integrals over matrix elements. For integrals over the elements of a symmetric matrix $S$ or an antisymmetric matrix $A$, how can I find the Jacobian of the transformations $S \rightarrow B^T S B$ or $A \rightarrow B^T A B$, where $B$ is some real invertible matrix? Below I share my work and my guesses for what the Jacobians should be.


Here's a warmup that shows how I personally would find the Jacobian corresponding to a linear transformation of my matrix of interest.

Consider an integral over the elements of a $d \times d$ matrix $X$. That is, consider the integral $$\int_{\Gamma} f(X)\, dX .$$ $f(X)$ is schematic for a function of all the matrix elements, $dX$ is schematic for $\prod_{i,j=1}^d dX_{ij}$, and $\Gamma$ is some $d^2$-dimensional integration region.

If I consider a real, invertible transformation $U=BX$, then by looking at the columns of $X$ (each column transforms as $U_{\cdot j} = B X_{\cdot j}$) one can quickly see that $$\int_{\Gamma} f(X)\, dX = \int_{\Gamma} f(X) \prod_{j=1}^d \Big(\prod_{i=1}^d dX_{ij}\Big) = \int_{B\Gamma} f(B^{-1}U) \prod_{j=1}^d \frac{\prod_{i=1}^d dU_{ij}}{\det(B)} = \int_{B\Gamma} f(B^{-1}U) \frac{dU}{\det(B)^d}.$$ That is, we have a Jacobian not of $\det(B)$ but of $\det(B)^d$. (Strictly the factor is $|\det(B)|^d$; I'll assume $\det(B) > 0$ throughout to avoid carrying absolute values.)

One can check this makes sense by taking $B$ to be a constant $c$ times the identity matrix: then $\det(B) = c^d$, and $\det(B)^d = c^{d^2}$, which is exactly what one gets by scaling each of the $d^2$ elements of the matrix $X$ by $c$.

Thus, for $U=BX$, we have $dX \rightarrow \frac{dU}{\det(B)^d}$.
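This factor is easy to check numerically. The following is a minimal sketch, assuming `numpy`: it builds the $d^2 \times d^2$ matrix of the linear map $X \mapsto BX$ by applying the map to each basis matrix and flattening the images into columns, then compares its determinant to $\det(B)^d$.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4
B = rng.standard_normal((d, d))

# Matrix of the linear map X -> BX: apply the map to each basis matrix
# E_k (a single 1 among zeros) and stack the flattened images as columns.
n = d * d
J = np.zeros((n, n))
for k in range(n):
    E = np.zeros(n)
    E[k] = 1.0
    J[:, k] = (B @ E.reshape(d, d)).ravel()

print(np.linalg.det(J))        # these two numbers should agree:
print(np.linalg.det(B) ** d)   # the Jacobian is det(B)^d
```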


Similarly, we can see that $$\text{for }U=B^TXB\text{, we have }dX \rightarrow \frac{dU}{\det(B)^{2d}}.$$ This can be seen quickly by performing the transformation in two steps, $X \rightarrow XB \rightarrow B^T(XB)$: the first step acts on the rows of $X$ and the second on the columns, and each contributes a factor of $\det(B)^d$.
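This, too, can be checked numerically. A minimal sketch, again assuming `numpy`, using the vectorization identity $\operatorname{vec}(B^T X B) = (B^T \otimes B^T)\operatorname{vec}(X)$, which for this particular map holds in both the row-major and column-major flattening conventions:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4
B = rng.standard_normal((d, d))

# vec(B^T X B) = (B^T kron B^T) vec(X), so the Jacobian of U = B^T X B is
# det(B^T kron B^T) = det(B)^d * det(B)^d = det(B)^(2d).
J = np.kron(B.T, B.T)
print(np.linalg.det(J))            # these two numbers should agree
print(np.linalg.det(B) ** (2 * d))
```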

However, what if our integration is over a symmetric matrix $S$? That is, consider the integral $\int_{\Delta} f(S)\,dS$, where $dS$ is schematic for $\prod_{i\geq j} dS_{ij}$ and $\Delta$ is the integration region; counting the number of independent components of a symmetric matrix shows that $\Delta$ must be a $\frac{1}{2}d(d+1)$-dimensional region.

Note that the transformation $U=B^TSB$ preserves the symmetry of $S$: $U$ is also symmetric. Then, by counting the number of independent components, $\frac{1}{2}d(d+1)$, I'd guess that

$$\text{for } B^TSB\text{, we have } dS \stackrel{?}{\rightarrow} \frac{dU}{\det(B)^{d+1}}.$$

Similarly, for an antisymmetric matrix $A$, the transformation $U=B^T A B$ preserves the antisymmetry; $U$ is also antisymmetric. Defining $dA = \prod_{i>j} dA_{ij}$, and noting that $A$ has $\frac{1}{2}d(d-1)$ independent components, I'd guess that

$$\text{for } B^TAB\text{, we have } dA \stackrel{?}{\rightarrow} \frac{dU}{\det(B)^{d-1}}.$$
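Both guesses can be tested numerically before attempting a proof. Below is a minimal sketch, assuming `numpy` (the helper name and basis convention are just one way to set this up): it uses the triangle entries as coordinates, builds the matrix of $S \mapsto B^T S B$ (or $A \mapsto B^T A B$) in those coordinates by applying the map to each basis element, and compares the determinant to the guessed power of $\det(B)$.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4
B = rng.standard_normal((d, d))

def congruence_jacobian_det(sign):
    """Determinant of S -> B^T S B in triangle coordinates.

    sign=+1: symmetric matrices, coordinates S[i, j] with i >= j;
    sign=-1: antisymmetric matrices, coordinates A[i, j] with i > j.
    """
    idx = [(i, j) for i in range(d) for j in range(i + 1 if sign > 0 else i)]
    J = np.zeros((len(idx), len(idx)))
    for k, (i, j) in enumerate(idx):
        S = np.zeros((d, d))      # basis element dual to coordinate k
        S[i, j] = 1.0
        S[j, i] = sign * 1.0
        U = B.T @ S @ B
        J[:, k] = [U[a, b] for (a, b) in idx]
    return np.linalg.det(J)

print(congruence_jacobian_det(+1), np.linalg.det(B) ** (d + 1))  # symmetric
print(congruence_jacobian_det(-1), np.linalg.det(B) ** (d - 1))  # antisymmetric
```

If the guesses are right, each pair of printed numbers should agree up to floating-point error.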


Are these guesses correct? How can I formally find the Jacobians of the transformations $U = B^T SB$ and $U=B^T AB$?


The transformation $f(S) = B^T S B$ is a polynomial transformation, in the sense that every matrix entry on the right is a polynomial in the entries of $S$, with coefficients depending on $B$. In fact it is even a linear transformation, so its Jacobian (derivative) at every point is the map itself. This is easy to see by picking some basepoint $P$ and making a small perturbation in the direction $H$:

$$ f(P + tH) = B^T P B + t\, B^T H B = f(P) + t\, B^T H B,$$

and hence $df_P(H) = B^T H B$ is the Jacobian at the point $P$.

The above working makes sense for any matrices $P$ and $H$. If you want to restrict to the linear subspace of symmetric or antisymmetric matrices, everything still works: the tangent space to a linear subspace is the subspace itself, so all that changes in the above working is that both $P$ and $H$ are symmetric, or antisymmetric.
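In particular, the Jacobian factor asked about above is the determinant of the linear map $H \mapsto B^T H B$ restricted to the relevant subspace. For small $d$ this determinant can be computed exactly as a polynomial identity. Here is a minimal symbolic sketch, assuming `sympy` is available, for the symmetric case with $d = 2$ (the antisymmetric case is analogous):

```python
import sympy as sp

d = 2
B = sp.Matrix(d, d, lambda i, j: sp.Symbol(f"b{i}{j}"))

# Coordinates on symmetric matrices: the entries S[i, j] with i >= j.
idx = [(i, j) for i in range(d) for j in range(i + 1)]
s = sp.symbols(f"s0:{len(idx)}")
S = sp.zeros(d, d)
for v, (i, j) in zip(s, idx):
    S[i, j] = S[j, i] = v

U = B.T * S * B
# Jacobian matrix of the coordinate map s -> (U[i, j], i >= j).
J = sp.Matrix([[sp.diff(U[i, j], v) for v in s] for (i, j) in idx])

print(sp.factor(J.det()))             # should print (b00*b11 - b01*b10)**3
print(sp.factor(B.det() ** (d + 1)))  # the guessed det(B)^(d+1)
```

Both lines should print $(b_{00} b_{11} - b_{01} b_{10})^3$, i.e. $\det(B)^{d+1}$ for $d = 2$, in agreement with the guess for the symmetric case.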