What is the physical significance of the determinants of orthogonal matrices having the value of $\pm 1$?
It means that orthogonal transformations preserve volumes. That is so because, if you have an object $O$ and $A$ is a linear transformation, then the volume of the image $A(O)$ is the volume of $O$ times the absolute value of $\det A$. For an orthogonal $A$ this factor is $1$; the sign of $\det A$ records whether the transformation preserves orientation ($+1$) or reverses it ($-1$).
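If it helps, here is a minimal numerical illustration using numpy (my own sketch, not part of the original claim): a general matrix scales the volume of the unit cube by $|\det A|$, while an orthogonal one leaves it unchanged.

```python
import numpy as np

rng = np.random.default_rng(0)

# A general linear map scales volumes by |det A|:
# the diagonal map below stretches the unit cube to a 2 x 3 x 1 box.
A = np.diag([2.0, 3.0, 1.0])
print(abs(np.linalg.det(A)))  # 6.0: the image of the unit cube has volume 6

# An orthogonal matrix (here built via a QR decomposition of a random matrix)
# has |det| = 1, so it maps the unit cube to a congruent, volume-1 region.
Q = np.linalg.qr(rng.standard_normal((3, 3)))[0]
print(np.linalg.det(Q))       # approximately +1 or -1
```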
Let $A$ be an orthogonal $n\times n$-matrix, so that $A^T A = I_n$ (where $I_n$ denotes the $n\times n$ identity matrix). Taking determinants yields $\det\left(A^T A\right) = \det\left(I_n\right) = 1$. But any two $n\times n$-matrices $X$ and $Y$ satisfy $\det\left(XY\right) = \det X \cdot \det Y$ (the famed multiplicativity property of the determinant), and yet another known property of determinants says $\det A^T = \det A$. Applying these to $X = A^T$ and $Y = A$, we obtain
$$1 = \det\left(A^T A\right) = \det A^T \cdot \det A = \left(\det A\right)^2.$$

Subtracting $1$ from this equality gives $0 = \left(\det A\right)^2 - 1 = \left(\det A - 1\right)\left(\det A + 1\right)$. But a product of two (complex or real or rational) numbers can only be $0$ if one of them is $0$. Hence, either $\det A - 1 = 0$ or $\det A + 1 = 0$; in other words, either $\det A = 1$ or $\det A = -1$.
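A quick numerical sanity check of each step (a numpy sketch; the random orthogonal matrix is built via a QR decomposition):

```python
import numpy as np

rng = np.random.default_rng(1)

# Random orthogonal matrix from a QR decomposition.
A = np.linalg.qr(rng.standard_normal((4, 4)))[0]

print(np.allclose(A.T @ A, np.eye(4)))        # True: A^T A = I_n
print(np.linalg.det(A.T @ A))                 # approximately 1
print(np.linalg.det(A.T), np.linalg.det(A))   # equal: det A^T = det A
print(np.linalg.det(A) ** 2)                  # approximately 1, so det A = +-1
```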
This argument works for any matrix $A$ satisfying $A^T A = I_n$ with complex (or real or rational) entries, or, more generally, with entries in any field. The geometric intuition ("orthogonal matrix = congruence transformation") does not hold in this generality.
OK, let me expand on my (now deleted) comment. To give a complete answer to your question, let me introduce a few concepts. Let us work in $V=\mathbb{R}^n$. I will use the notation $(x,y)$ for the inner product of $x,y\in V$ (which is $x^Ty$ in $\mathbb{R}^n$).
Householder transformations: A Householder transformation is defined as follows. Fix a unit vector $v\in V$ and define the linear transformation $T_v:V\to V$ by $$ T_v(x) = x-2(x,v)v. $$ Geometrically, the vector $v$ determines an $(n-1)$-dimensional hyperplane $H_v$ passing through the origin and perpendicular to $v$, and $T_v$ reflects any vector $x$ across this hyperplane.
I leave it as an exercise to show that $T_v$ is orthogonal. Furthermore, note that if $x$ lies in the hyperplane $H_v$, i.e. $(x,v)=0$, then $T_v(x)=x$, whereas $T_v(v)=-v$. This means that $v$ is (up to scaling) the unique eigenvector of $T_v$ with eigenvalue $-1$. The other eigenvalue of $T_v$ is $+1$ with multiplicity $n-1$ (as the dimension of $H_v$ is $n-1$). Hence $\det T_v = (-1)\cdot(+1)^{n-1} = -1$.
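Here is a small numpy sketch (my own, purely illustrative) that builds the matrix $I - 2vv^T$ of $T_v$ and checks orthogonality, the eigenvalues, and $\det T_v = -1$:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4

# A unit vector v and the matrix of T_v(x) = x - 2 (x, v) v, i.e. I - 2 v v^T.
v = rng.standard_normal(n)
v /= np.linalg.norm(v)
T = np.eye(n) - 2.0 * np.outer(v, v)

print(np.allclose(T.T @ T, np.eye(n)))   # True: T_v is orthogonal
print(np.allclose(T @ v, -v))            # True: T_v(v) = -v
eigvals = np.linalg.eigvalsh(T)          # T is symmetric, so eigvalsh applies
print(np.sort(eigvals))                  # one eigenvalue -1, the rest +1
print(np.linalg.det(T))                  # approximately -1
```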
Another exercise you can do is to show that the composition of two Householder transformations is a rotation (note that in this case $\det = +1$). HINT: Let $u,v$ be the two vectors determining these Householder transformations. Define $W=\mathrm{span}\{u,v\}$; then $V=W\oplus W^\perp$. Note that if your transformation is $U$, then $U$ fixes $W^\perp$ pointwise, so your problem is essentially 2-dimensional.
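A numerical illustration of this exercise in $\mathbb{R}^3$ (a sketch; the helper `householder` is my own, not from the answer):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 3

def householder(v):
    """Matrix of the reflection T_v(x) = x - 2 (x, v) v for the direction v."""
    v = v / np.linalg.norm(v)
    return np.eye(len(v)) - 2.0 * np.outer(v, v)

u, v = rng.standard_normal(n), rng.standard_normal(n)
U = householder(u) @ householder(v)      # composition of two reflections

print(np.allclose(U.T @ U, np.eye(n)))   # True: U is orthogonal
print(np.linalg.det(U))                  # approximately +1: U is a rotation

# U fixes W^perp pointwise: any w orthogonal to both u and v satisfies U w = w.
w = np.cross(u, v)                       # in R^3, w spans W^perp
print(np.allclose(U @ w, w))             # True
```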
Finally, consider a diagonal matrix of the form $$ M=\mathrm{diag}(\underbrace{-1, \cdots,-1}_{m\text{ times}}, \underbrace{+1, \cdots,+1}_{n-m\text{ times}}). $$ If this is the matrix of a transformation $U$ in the standard basis $e_1, \cdots, e_n$, and $T_i=T_{e_i}$ is the Householder transformation defined above, then $$ U= T_m\circ \cdots \circ T_2\circ T_1. $$ More generally, suppose $U$ is any orthogonal transformation and $M$ is its matrix. Define the new orthonormal basis $b_i=U(e_i)$. Note that we can always find a rotation $R$ which sends each $e_i$ either to $b_i$ or to $-b_i$ (in other words, the $x_i$ axis goes to the $b_i$ axis). In the new basis $b_i$, the matrix of $U$ is of the above form. In other words, combining everything we learned:
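A short numerical check of the diagonal case (numpy again; the `householder` helper is my own illustration):

```python
import numpy as np

n, m = 5, 3

def householder(v):
    """Matrix of T_v for a unit vector v: I - 2 v v^T."""
    return np.eye(len(v)) - 2.0 * np.outer(v, v)

# M = diag(-1, ..., -1, +1, ..., +1) with m entries equal to -1.
M = np.diag([-1.0] * m + [1.0] * (n - m))

# U = T_m o ... o T_2 o T_1 with T_i the Householder reflection along e_i.
U = np.eye(n)
for i in range(m):
    e_i = np.eye(n)[i]
    U = householder(e_i) @ U

print(np.allclose(U, M))   # True
print(np.linalg.det(M))    # (-1)^m, here -1
```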
Any orthogonal transformation is a composition of rotations and Householder transformations (i.e. reflections). One can take this even further: if $\det =+1$, the transformation involves an even number of reflections; if $\det =-1$, an odd number. One can even show that all of this simplifies to: $\det =+1\Longrightarrow$ one rotation; $\det =-1\Longrightarrow$ one reflection followed by one rotation.
Orthogonal matrices/transformations are essentially the mathematical way to speak about rotations (and/or reflections). This physical interpretation gives an easy way to see that they should have $\det(O)=\pm 1$.
The special thing about (certain) rotations (and/or reflections), in contrast to general linear transformations, is that by applying them multiple times, you come back to the identity transformation. E.g. turning to the left four times (by $90^\circ$ each time) brings you back to your original orientation. Mathematically, this means $O^n=\mathrm{Id}$, where $\mathrm{Id}$ is the identity transformation that does nothing, and $n$ is the number of turns you have to make to be in the original orientation again. Now use some properties of the determinant:
$$\det(O)^n=\det(O^n)=\det(\mathrm{Id})=1.$$
This leaves us with no option besides $\det(O)$ being an $n$-th root of unity, and the only real roots of unity are $\pm1$.
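To see this concretely, here is a numpy sketch of the $90^\circ$ example above, together with a reflection:

```python
import numpy as np

# Rotation by 90 degrees: four applications give the identity.
theta = np.pi / 2
O = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

print(np.allclose(np.linalg.matrix_power(O, 4), np.eye(2)))  # True: O^4 = Id
print(np.linalg.det(O))    # 1.0, consistent with det(O)^4 = 1

# A reflection satisfies O^2 = Id and has determinant -1.
R = np.array([[1.0, 0.0], [0.0, -1.0]])
print(np.allclose(R @ R, np.eye(2)), np.linalg.det(R))       # True, -1.0
```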
The reasoning explained above only works for rotations whose angle is a rational multiple of $2\pi$ (so that $O^n=\mathrm{Id}$ holds for some $n\in\Bbb N$). However, it extends to rotations by arbitrary angles via a continuity argument: the determinant depends continuously on the rotation, and the rotations covered above are dense among all rotations.