Solution 1:

Let $Q$ be an orthonormal matrix. By definition, an orthonormal matrix is a square matrix satisfying the following conditions:

($i$) $Q^{-1}=Q^T$. From here it follows that $QQ^T=QQ^{-1}=I$, where $I$ is the identity matrix, the square matrix with $1$s on the diagonal and $0$s elsewhere.

($ii$) The rows and columns of an orthonormal matrix satisfy the inner product rules $<q_i,q_j>=0$ for $i\neq j$ and $<q_i,q_i>=1$.

A matrix is called orthogonal (but not necessarily orthonormal) when $<q_i,q_j>=0$ holds for $i\neq j$. This means the angle between every pair of distinct columns $q_i$ and $q_j$ of $Q$ is $90$ degrees, which follows from $$\cos(\theta)=\frac{<u_i,u_j>}{||u_i||\,||u_j||},$$ where $||u_i||=\sqrt{<u_i,u_i>}$ is the Euclidean ($L^2$) norm induced by the inner product.
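These column properties are easy to check numerically. A minimal sketch with NumPy, using a $2\times 2$ rotation matrix as a standard example of an orthonormal matrix (the angle $\pi/4$ is chosen only for illustration):

```python
import numpy as np

# A 2x2 rotation matrix is orthonormal for any angle; pi/4 is arbitrary here.
theta = np.pi / 4
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

q1, q2 = Q[:, 0], Q[:, 1]

# <q_i, q_i> = 1: the columns are unit vectors.
print(np.dot(q1, q1))
# <q_i, q_j> = 0 for i != j: the columns are orthogonal.
print(np.dot(q1, q2))

# Angle between the columns via cos(theta) = <u,v> / (||u|| ||v||):
cos_angle = np.dot(q1, q2) / (np.linalg.norm(q1) * np.linalg.norm(q2))
print(np.degrees(np.arccos(cos_angle)))  # 90 degrees
```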

The absolute value of the determinant of every orthonormal matrix is $1$. This can be proven in at least two different ways:

$1$st way: $$1=\det(I)=\det(QQ^T)=\det(Q)\det(Q^T)=(\det(Q))^2.$$ The third equality uses the product rule $\det(AB)=\det(A)\det(B)$, and the last one uses $\det(Q^T)=\det(Q)$.
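The chain of equalities can be verified numerically. A small sketch (the rotation angle $0.3$ is arbitrary):

```python
import numpy as np

# Any rotation matrix is orthonormal; the angle 0.3 is arbitrary.
theta = 0.3
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# det(Q Q^T) = det(I) = 1
print(np.linalg.det(Q @ Q.T))
# (det Q)^2 = 1, so det Q = +1 or -1
print(np.linalg.det(Q) ** 2)
```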

$2$nd way:

($i$) For every orthonormal matrix $Q$, all singular values $\sigma_i$ of this matrix are equal to $1$.

($ii$) The absolute value of the determinant of any real square matrix $A$ is given by $|\det(A)|=\prod_i \sigma_i$.

From ($i$) and ($ii$) we conclude that $|\det(Q)|=\prod_i 1\Longrightarrow |\det(Q)|=1$.

The proof of ($i$):

For every real matrix $Q$ we have the singular value decomposition $Q=U\Sigma V^T$, where $U$ and $V$ are orthonormal matrices and $\Sigma$ is diagonal with nonnegative entries. If $Q$ itself is orthonormal, one can select $U=Q$ and $V=I$, and from here we get $\Sigma=I$. Since the singular values are uniquely determined by $Q$, the proof is complete. Note that $U$ and $V$ themselves are not unique.
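Claim ($i$) can also be checked numerically: a random orthonormal matrix (obtained here via the QR decomposition of a random matrix; the size $4$ and the seed are arbitrary) has all singular values equal to $1$.

```python
import numpy as np

# Build a random orthonormal matrix via QR decomposition
# (size 4 and seed 0 are arbitrary choices).
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))

# All singular values of an orthonormal matrix are 1.
sigma = np.linalg.svd(Q, compute_uv=False)
print(sigma)
```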

The proof of ($ii$):

Take the singular value decomposition $A=U\Sigma V^T$. Then $$|\det(A)|=|\det(U)|\,|\det(\Sigma)|\,|\det(V^T)|=\prod_i \sigma_i,$$ since $|\det(U)|=|\det(V)|=1$ by the $1$st way above, and the determinant of the diagonal matrix $\Sigma$ is the product of its diagonal entries $\sigma_i$.
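Claim ($ii$) holds for any real square matrix, not just orthonormal ones. A quick numerical check on a random $3\times 3$ matrix (size and seed arbitrary):

```python
import numpy as np

# A generic (non-orthonormal) random matrix; size 3 and seed 1 are arbitrary.
rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))

# |det(A)| equals the product of the singular values.
sigma = np.linalg.svd(A, compute_uv=False)
print(abs(np.linalg.det(A)))
print(np.prod(sigma))
```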

That $J_{n,k}$ is orthonormal (i.e., its columns form an orthonormal basis) is easy to justify from the properties of orthonormal matrices given above. Therefore, we can conclude that $|\det J_{n,k}|=1$.