This is just about the $n=2$ case.

If $e_1,e_2$ is the standard basis of $V=\mathbb{R}^2$, then the matrix of $R_\theta\otimes R_\theta$ on $V\otimes V$ with respect to the ordered basis $e_1\otimes e_1,e_1\otimes e_2,e_2\otimes e_1,e_2\otimes e_2$ is readily computed (see the answer by @Jean Marie) as $$R_\theta\otimes R_\theta=\begin{pmatrix} c^2& -cs& -cs& s^2\\ cs & c^2& -s^2& -cs\\ cs & -s^2& c^2& -cs\\ s^2 & cs & cs& c^2\end{pmatrix}$$ where $c:=\cos\theta$, $s:=\sin\theta$.

The $1$-eigenspace of dimension $2$ has basis $e_1\otimes e_1 +e_2\otimes e_2, e_1\otimes e_2 -e_2\otimes e_1$: to verify this just add/subtract the relevant rows/columns of the matrix.

The $2$-dimensional space ('plane') on which $R_\theta\otimes R_\theta$ acts as rotation by $2\theta$ must be the orthogonal complement of the $1$-eigenspace (since $R_\theta\otimes R_\theta$ is orthogonal), that is, the space with basis $e_1\otimes e_1 -e_2\otimes e_2, e_1\otimes e_2 +e_2\otimes e_1$. Again this is easily verified by adding/subtracting the relevant rows/columns of the matrix.
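A quick numerical check of both claims, assuming numpy is available (the test angle and the variable names `f1, f2, g1, g2` are mine, matching the basis vectors above):

```python
import numpy as np

theta = 0.7  # arbitrary test angle
c, s = np.cos(theta), np.sin(theta)
R = np.array([[c, -s], [s, c]])
Q = np.kron(R, R)  # matrix of R_theta (x) R_theta in the basis e1e1, e1e2, e2e1, e2e2

# claimed basis of the 1-eigenspace
f1 = np.array([1.0, 0, 0, 1])   # e1(x)e1 + e2(x)e2
f2 = np.array([0.0, 1, -1, 0])  # e1(x)e2 - e2(x)e1
assert np.allclose(Q @ f1, f1) and np.allclose(Q @ f2, f2)

# claimed rotation plane (orthonormalised)
g1 = np.array([1.0, 0, 0, -1]) / np.sqrt(2)  # e1(x)e1 - e2(x)e2
g2 = np.array([0.0, 1, 1, 0]) / np.sqrt(2)   # e1(x)e2 + e2(x)e1
P = np.column_stack([g1, g2])
R2 = np.array([[np.cos(2 * theta), -np.sin(2 * theta)],
               [np.sin(2 * theta),  np.cos(2 * theta)]])
# restricted to span{g1, g2}, Q acts as rotation by 2*theta
assert np.allclose(P.T @ Q @ P, R2)
```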

Comment 1 In fact the same argument (with the same bases) would let us identify the two 'planes' for $R_\theta\otimes R_\phi$: the one on which it acts as rotation by $\theta+\phi$, and the one on which it acts as rotation by $\theta-\phi$. This in principle gives a recursive way of building up appropriate bases/planes for $(R_\theta)^{(n+1)}=(R_\theta)^{(n)}\otimes R_\theta$.
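The claim of Comment 1 can be checked in the same way; a small sketch with two arbitrary angles (again assuming numpy; the helper `rot` is mine):

```python
import numpy as np

def rot(a):
    return np.array([[np.cos(a), -np.sin(a)], [np.sin(a), np.cos(a)]])

theta, phi = 0.7, 0.3
Q = np.kron(rot(theta), rot(phi))

# same two planes as in the n = 2 case above
P_plus  = np.column_stack([[1, 0, 0, -1], [0, 1, 1, 0]]) / np.sqrt(2)
P_minus = np.column_stack([[1, 0, 0, 1], [0, 1, -1, 0]]) / np.sqrt(2)

# rotation by theta + phi on the first plane
assert np.allclose(P_plus.T @ Q @ P_plus, rot(theta + phi))
# rotation by theta - phi on the second plane (the orientation induced by this
# ordering of the basis vectors happens to be the opposite one)
assert np.allclose(P_minus.T @ Q @ P_minus, rot(phi - theta))
```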

Comment 2 If one attempts to deal with $R_\theta^{(n)}$ directly, it seems to me that the Chebyshev polynomials have a role to play.


Let $c=\cos \theta, s=\sin \theta$. We have

$$Q_{\theta}=\begin{pmatrix} c^2& -cs& -cs& s^2\\ cs & c^2& -s^2& -cs\\ cs & -s^2& c^2& -cs\\ s^2 & cs & cs& c^2\end{pmatrix}$$

Since $Q_{\theta}Q_{\theta'}=Q_{\theta+\theta'}$, the (orthogonal) matrices $Q_{\theta}$ form a commutative subgroup of $SO(4,\mathbb R)$.
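This group law is a direct consequence of the mixed-product property $(A\otimes B)(C\otimes D)=(AC)\otimes(BD)$, and is easy to confirm numerically; a minimal sketch, assuming numpy (angle values arbitrary, helper names mine):

```python
import numpy as np

def rot(a):
    return np.array([[np.cos(a), -np.sin(a)], [np.sin(a), np.cos(a)]])

def Q(a):
    return np.kron(rot(a), rot(a))

t1, t2 = 0.4, 1.1
assert np.allclose(Q(t1) @ Q(t2), Q(t1 + t2))   # group law Q_t1 Q_t2 = Q_{t1+t2}
assert np.allclose(Q(t1) @ Q(t1).T, np.eye(4))  # orthogonality
assert np.isclose(np.linalg.det(Q(t1)), 1.0)    # determinant 1, so Q_t1 lies in SO(4)
```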

By a standard property of the Kronecker product, the eigenvalues of $Q_{\theta}$ are all the possible products of two eigenvalues of $R_{\theta}$, each factor being taken from the set $\{e^{i \theta}, e^{-i \theta}\}$.

Conclusion: the eigenvalues of $Q_{\theta}$ are

$$1 \ \ \text{(multiplicity 2) and } \ e^{\pm 2 i \theta}$$
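If one wants to confirm this spectrum numerically, a short sketch (assuming numpy, with an arbitrary test angle) suffices:

```python
import numpy as np

theta = 0.7
c, s = np.cos(theta), np.sin(theta)
R = np.array([[c, -s], [s, c]])
Q = np.kron(R, R)

# eigenvalues of a Kronecker product = all pairwise products of the factors' eigenvalues
expected = np.array([1.0, 1.0, np.exp(2j * theta), np.exp(-2j * theta)])
assert np.allclose(np.sort_complex(np.linalg.eigvals(Q)), np.sort_complex(expected))
```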

The eigenspace associated with the eigenvalue $\color{red}{1}$ is $2$-dimensional (a priori, it could have been only $1$-dimensional). Indeed, in:

$$Q_{\theta}-\color{red}{1}I_4=\begin{pmatrix} -s^2& -cs& -cs& s^2\\ cs & -s^2& -s^2& -cs\\ cs & -s^2& -s^2& -cs\\ s^2 & cs & cs& -s^2\end{pmatrix},$$

row 2 and row 3 are identical, and row 1 and row 4 are opposite, while rows 1 and 2 are independent (for $s \neq 0$), giving $\operatorname{rank}(Q_{\theta}- I_4) = 2$ and hence a $2$-dimensional kernel.

I leave you the computation of the eigenspaces associated with the complex eigenvalues.

Remark: if $t:=\dfrac{s}{c}=\tan \theta$, the matrix $Q_{\theta}$ can be written in a slightly simpler form:

$$Q_{\theta}=c^2 \begin{pmatrix} 1& -t& -t& t^2\\ t & 1& -t^2& -t\\ t & -t^2& 1& -t\\ t^2 & t & t& 1\end{pmatrix}$$

Edit: At the suggestion of @ancient mathematician, I have attempted to go to the next step, taking a double Kronecker product,

$$Q=R_{\theta}\otimes R_{\phi}\otimes R_{\nu}$$

but this time with (three) different angles. For the sake of simplicity, I have set:

$$C=\cos \theta,\ S=\sin \theta, \qquad c=\cos \phi,\ s=\sin \phi, \qquad g=\cos \nu,\ z=\sin \nu$$

giving:

$$Q=\left(\begin{array}{rrrr|rrrr} Ccg &-Ccz&-Cgs& Csz&-Scg& Scz& Sgs&-Ssz\\ Ccz& Ccg&-Csz&-Cgs&-Scz&-Scg& Ssz& Sgs\\ Cgs&-Csz& Ccg&-Ccz&-Sgs& Ssz&-Scg& Scz\\ Csz& Cgs& Ccz& Ccg&-Ssz&-Sgs&-Scz&-Scg\\ \hline Scg&-Scz&-Sgs& Ssz& Ccg&-Ccz&-Cgs& Csz\\ Scz& Scg&-Ssz&-Sgs& Ccz& Ccg&-Csz&-Cgs\\ Sgs&-Ssz& Scg&-Scz& Cgs&-Csz& Ccg&-Ccz\\ Ssz& Sgs& Scz& Scg& Csz& Cgs& Ccz& Ccg \end{array}\right)$$

This $2^3 \times 2^3$ matrix has all its diagonal entries equal to the product $Ccg$ of the three cosines. Therefore its trace, which is also the sum of its eigenvalues, is $8\,Ccg$.

Besides, we know that the eigenvalues of the three factors are

$$(e^{i \theta},e^{-i \theta}), \ (e^{i \phi},e^{-i \phi}), \ (e^{i \nu},e^{-i \nu})$$

and that the eigenvalues of $Q$ are all the possible products of $3$ elements, one taken from each pair; therefore the trace of the "big matrix" above is given by the product

$$(e^{i \theta}+e^{-i \theta}) (e^{i \phi}+e^{-i \phi}) (e^{i \nu}+e^{-i \nu})$$

which is, unsurprisingly,

$$2 \cos \theta \cdot 2 \cos \phi \cdot 2 \cos \nu = 8\, Ccg$$
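Both the statement about the diagonal entries and the trace identity can be confirmed numerically; a small sketch assuming numpy (the three angles are arbitrary test values):

```python
import numpy as np

def rot(a):
    return np.array([[np.cos(a), -np.sin(a)], [np.sin(a), np.cos(a)]])

theta, phi, nu = 0.7, 0.3, 1.2
Q = np.kron(np.kron(rot(theta), rot(phi)), rot(nu))  # 8 x 8

# every diagonal entry equals the product of the three cosines ...
assert np.allclose(np.diag(Q), np.cos(theta) * np.cos(phi) * np.cos(nu))
# ... so the trace equals 8 C c g
assert np.isclose(np.trace(Q), 8 * np.cos(theta) * np.cos(phi) * np.cos(nu))
```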


Based on the insights given by the other contributors, I attempt to provide an answer to the original question for the general case of $R_\theta\, ^{\otimes n}$.

Eigenvalues. The two eigenvalues of $R_\theta$ are respectively $\lambda_0 = e^{i\theta}$ and $\lambda_1 = e^{-i\theta}$. We know that the eigenvalues of a Kronecker product are the products of the eigenvalues of the factors. More precisely, if we keep track of the eigenvalues by means of a formal polynomial in two variables $x_0,x_1$, then the eigenvalues of $R_\theta\, ^{\otimes n}$, counted with multiplicity, can be extracted from the binomial expansion of $(\lambda_0 x_0 + \lambda_1 x_1)^n$, which yields:

$$\sum_{k=0}^n \binom{n}{k} e^{i(n-2k)\theta}x_0\,^{n-k}x_1\,^k$$

From the above expression we can read off that the eigenvalue $e^{i(n-2k)\theta}$ occurs with multiplicity $\binom{n}{k}$ (for a generic angle $\theta$); in particular:

  1. when $n$ is odd, there is no eigenvalue equal to $1$, and there are $2\binom{n}{(n-1)/2}$ eigenvalues equal to $e^{\pm i\theta}$ ($\binom{n}{(n-1)/2}$ of each).
  2. when $n$ is even, there are $\binom{n}{n/2}$ eigenvalues equal to 1, but there can't be any eigenvalue $e^{\pm i\theta}$.

Observation 2 is consistent with what was already shown in the case $n=2$, where $R_\theta\,^{\otimes 2}$ has a $2$-dimensional $1$-eigenspace and acts on the plane perpendicular to it as a rotation by the double angle $2\theta$.
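The multiplicity count can also be checked numerically for small $n$; the following sketch (assuming numpy, with a generic test angle so that the values $e^{i(n-2k)\theta}$ are pairwise distinct) compares the computed spectrum of $R_\theta\,^{\otimes n}$ with the binomial coefficients:

```python
import numpy as np
from functools import reduce
from math import comb

theta = 0.7  # generic angle: no coincidences among the e^{i(n-2k)theta}
R = np.array([[np.cos(theta), -np.sin(theta)], [np.sin(theta), np.cos(theta)]])

for n in range(1, 6):
    Q = reduce(np.kron, [R] * n)   # R_theta^{(x) n}, a 2^n x 2^n matrix
    eig = np.linalg.eigvals(Q)
    for k in range(n + 1):
        lam = np.exp(1j * (n - 2 * k) * theta)
        # the eigenvalue e^{i(n-2k)theta} should occur with multiplicity C(n, k)
        assert np.sum(np.isclose(eig, lam)) == comb(n, k)
```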

Eigenvectors. The two eigenvectors of $R_\theta$ corresponding to $\lambda_0 ,\lambda_1$ are respectively $u_0 = \begin{bmatrix}1\\ -i\end{bmatrix}$ and $u_1 = \begin{bmatrix}1\\ i\end{bmatrix}$ (with the convention for $R_\theta$ used above, $R_\theta u_0 = e^{i\theta}u_0$ and $R_\theta u_1 = e^{-i\theta}u_1$). We know that the eigenvectors of a matrix resulting from a Kronecker product are given by the Kronecker products of the eigenvectors of the multiplicand matrices (see Theorem 13.12 in [1]). However, before taking multiple Kronecker products of $u_0$ and $u_1$, it is convenient to re-express them in the following way:

$$u_0 = u_0 \odot c_0 = u_0 \odot \begin{bmatrix}1\\ 1\end{bmatrix}$$ $$u_1 = u_0 \odot c_1 = u_0 \odot \begin{bmatrix}1\\ -1\end{bmatrix}$$

where $\odot$ denotes the Hadamard (pointwise) product. Note that the Kronecker product satisfies a mixed-product property with the Hadamard product, i.e. $(A\otimes B)\odot(C\otimes D)=(A\odot C)\otimes(B\odot D)$ for matrices of matching sizes. This makes the following calculation easier to handle. Consider an arbitrary eigenvector of $R_\theta\,^{\otimes n}$:
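As a quick sanity check of this Kronecker-Hadamard mixed-product identity, one can run a sketch like the following, assuming numpy:

```python
import numpy as np

rng = np.random.default_rng(0)
A, B, C, D = (rng.standard_normal((2, 2)) for _ in range(4))

# (A kron B) Hadamard (C kron D) == (A Hadamard C) kron (B Hadamard D)
lhs = np.kron(A, B) * np.kron(C, D)   # '*' is the entrywise (Hadamard) product
rhs = np.kron(A * C, B * D)
assert np.allclose(lhs, rhs)
```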

$$u_{b_1\ldots b_n}:=u_{b_1} \otimes \ldots \otimes u_{b_n}$$ where the binary $n$-tuple $(b_1,\ldots,b_n)\in\{0,1\}^n$ is a multi-index. Using the mixed-product property, the above equation can be re-written as:

$$\begin{array}{lll} u_{b_1}\otimes\ldots\otimes u_{b_n} & = & (u_0\odot c_{b_1})\otimes\ldots\otimes (u_0 \odot c_{b_n})\\ & = & (u_0\,^{\otimes n}) \odot (c_{b_1}\otimes\ldots\otimes c_{b_n})\\ & = & (u_0\,^{\otimes n}) \odot c_{b_1\ldots b_n}\\ \end{array}$$

It is interesting to note that if we consider the binary number $b=(b_1\ldots b_n)_2$, then the vector $c_{b_1\ldots b_n}$ is exactly the $(b+1)$-th column of the naturally ordered Walsh matrix $W(n)$. Thus, the problem is now reduced to expanding the Kronecker power $u_0\,^{\otimes n}$. Let us write $u_0 = e_1 - ie_2$, and introduce the notation $a^j b^k$ to denote the sum of all the $(j+k)$-fold Kronecker products of $a$ and $b$ in which they appear respectively $j$ and $k$ times (for example $a^1b^1 = a\otimes b + b \otimes a$). We then have:

$$\begin{array}{lll} u_0\,^{\otimes n} & = & (e_1 - ie_2)^{\otimes n}\\ & = & \sum_{k=0}^n (-i)^k \sum e_1^{n-k}e_2^k\\ & = & \sum_{k\, \mathrm{even}} (-1)^{k/2} e_1^{n-k}e_2^k \;-\; i\sum_{k\, \mathrm{odd}}(-1)^{\frac{k-1}{2}} e_1^{n-k}e_2^k \end{array}$$

By using the last two formulas and setting $n=2$, we can easily recover the basis vectors of the eigenplanes found by Ancientmathematician. It should be relatively straightforward (though somewhat cumbersome to do manually) to find the eigenplanes for $n>2$.
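As an illustration, here is a numerical sketch of the construction for $n=3$, assuming numpy (the variable names are mine): it builds each $c_{b_1\ldots b_n}$ as a column of the Walsh matrix, forms $u_{b_1\ldots b_n}=u_0\,^{\otimes n}\odot c_{b_1\ldots b_n}$, and checks the corresponding eigenvalue. The real and imaginary parts of each conjugate pair of these eigenvectors then span the real invariant planes.

```python
import numpy as np
from functools import reduce

def rot(a):
    return np.array([[np.cos(a), -np.sin(a)], [np.sin(a), np.cos(a)]])

theta, n = 0.7, 3
Q = reduce(np.kron, [rot(theta)] * n)        # R_theta^{(x) n}

u0 = np.array([1, -1j])                       # eigenvector of R_theta for e^{+i theta}
c = [np.array([1, 1]), np.array([1, -1])]     # c_0 and c_1

for b in range(2 ** n):
    bits = [(b >> (n - 1 - j)) & 1 for j in range(n)]   # (b_1, ..., b_n)
    c_b = reduce(np.kron, [c[bit] for bit in bits])     # column of the Walsh matrix W(n)
    u_b = reduce(np.kron, [u0] * n) * c_b               # u_{b_1...b_n} = u_0^{(x)n} Hadamard c_b
    lam = np.exp(1j * theta * (n - 2 * sum(bits)))      # predicted eigenvalue
    assert np.allclose(Q @ u_b, lam * u_b)
```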

[1] Laub, Alan J., Matrix Analysis for Scientists and Engineers, SIAM, Philadelphia, PA, 2005, xiii+157 pp. ISBN 0-89871-576-8. Zbl 1077.15001.