Is there an easy way to find the sign of the determinant of an orthogonal matrix?

It depends on what "easy way" means. There is no known shortcut specific to determinants of orthogonal matrices, but most known algorithms run faster on them. This is not detected by simple complexity estimates in terms of matrix size only: such estimates assume that arithmetic operations take constant time regardless of the size of the numbers involved, and that assumption hides precisely the advantage that orthogonal matrices have over worse-conditioned ones.
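In floating point the practical answer is simple: any standard $O(n^3)$ factorization yields the sign, and for a well-conditioned (in particular orthogonal) matrix rounding errors are very unlikely to flip it. Below is a minimal sketch using NumPy's `slogdet`, which returns the sign together with the log of the absolute determinant via an LU factorization; the helper name `det_sign` and the QR-based test matrices are purely illustrative choices, not anything from the question.

```python
import numpy as np

def det_sign(a: np.ndarray) -> float:
    """Sign of det(a) via an LU-based factorization (O(n^3) flops)."""
    sign, _logabsdet = np.linalg.slogdet(a)
    return sign  # +1.0, -1.0, or 0.0 if the matrix is singular

# Illustrative orthogonal test matrix from a random QR factorization.
rng = np.random.default_rng(0)
q, _ = np.linalg.qr(rng.standard_normal((5, 5)))   # q is orthogonal
print(det_sign(q))                                  # +1 or -1, depending on the draw
print(det_sign(q @ np.diag([-1, 1, 1, 1, 1])))      # multiplying det by -1 flips the sign
```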

If one takes into account the size of the numbers appearing in the computation, i.e. uses bit cost, then instead of $O(n^3)$ one gets the more nuanced bound $O^{\sim}\big(n^3(1+\log(\Delta(A)\|A\|))\big)$, where $\Delta(A)$ is the orthogonality defect of $A$ (the soft $O^{\sim}$ means that logarithmic factors are suppressed). This shows a clear advantage for orthogonal matrices: in general $\log(\Delta(A)\|A\|)$ can be as large as $O^{\sim}(n\log\|A\|)$, whereas for an orthogonal matrix $\Delta(A)=1$, so the logarithmic factor is negligible.
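A common definition of the orthogonality defect (I assume this is the one meant here) is the Hadamard ratio $\Delta(A)=\prod_i\|a_i\|/|\det A|$, where the $a_i$ are the columns of $A$; by Hadamard's inequality it is $\geq 1$, with equality exactly when the columns are pairwise orthogonal. A quick sketch, with the function name chosen only for illustration:

```python
import numpy as np

def orthogonality_defect(a: np.ndarray) -> float:
    """Hadamard ratio: product of column norms divided by |det(A)|.

    Equals 1 exactly when the columns of A are pairwise orthogonal,
    and grows as the columns become closer to linearly dependent.
    """
    col_norms = np.linalg.norm(a, axis=0)
    return float(np.prod(col_norms) / abs(np.linalg.det(a)))

rng = np.random.default_rng(1)
q, _ = np.linalg.qr(rng.standard_normal((6, 6)))
print(orthogonality_defect(q))                             # ~1.0 for an orthogonal matrix
print(orthogonality_defect(rng.standard_normal((6, 6))))   # typically noticeably larger
```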

The $O(n^3)$ baseline comes from standard methods such as Gaussian elimination and can be improved. Strassen proved that the complexity of computing determinants is equivalent to that of matrix multiplication. The best known multiplication algorithm is Le Gall's, with complexity $O(n^{2.3728639})$, better than the classical $O(n^3)$. But it seems to be unknown whether the best exponent for orthogonal matrices is strictly smaller than for general ones. Even if it is not, computation for them is still easier in terms of bit cost.
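For reference, here is a minimal sketch of Strassen's original sub-cubic multiplication scheme (exponent $\log_2 7\approx 2.81$), the kind of fast multiplication that the determinant reduction relies on; Le Gall's algorithm is far more intricate and of purely theoretical interest. The recursion below assumes square matrices whose size is a power of two, and the `cutoff` fallback is just to keep the example short and practical.

```python
import numpy as np

def strassen(a: np.ndarray, b: np.ndarray, cutoff: int = 64) -> np.ndarray:
    """Strassen's 7-multiplication scheme; assumes n x n inputs with n a power of two."""
    n = a.shape[0]
    if n <= cutoff:                      # fall back to ordinary multiplication on small blocks
        return a @ b
    h = n // 2
    a11, a12, a21, a22 = a[:h, :h], a[:h, h:], a[h:, :h], a[h:, h:]
    b11, b12, b21, b22 = b[:h, :h], b[:h, h:], b[h:, :h], b[h:, h:]

    # Seven recursive products instead of the naive eight.
    m1 = strassen(a11 + a22, b11 + b22, cutoff)
    m2 = strassen(a21 + a22, b11, cutoff)
    m3 = strassen(a11, b12 - b22, cutoff)
    m4 = strassen(a22, b21 - b11, cutoff)
    m5 = strassen(a11 + a12, b22, cutoff)
    m6 = strassen(a21 - a11, b11 + b12, cutoff)
    m7 = strassen(a12 - a22, b21 + b22, cutoff)

    c = np.empty_like(a)
    c[:h, :h] = m1 + m4 - m5 + m7
    c[:h, h:] = m3 + m5
    c[h:, :h] = m2 + m4
    c[h:, h:] = m1 - m2 + m3 + m6
    return c

# Sanity check against NumPy's built-in multiplication.
rng = np.random.default_rng(2)
x, y = rng.standard_normal((128, 128)), rng.standard_normal((128, 128))
assert np.allclose(strassen(x, y), x @ y)
```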