Every matrix $R \in SO(n)$ is connected to a matrix of the form $\begin{bmatrix} 1 & 0 \\ 0 & R_1 \end{bmatrix}$, where $R_1 \in SO(n-1)$
Show that every matrix $R \in SO(n)$ is connected to a block matrix of the form $$\begin{bmatrix} 1 & 0 \\ 0 & R_1 \end{bmatrix}$$ where $R_1 \in SO(n-1)$.
This is part of problem #13 from chapter one of Brian Hall's book on Lie theory. I started off by showing that for any unit vector $v \in \mathbb{R}^2$, there is a path $R(t) \in SO(2)$ such that $R(0)=I$ and $R(1)v=e_1$. This was easy to do because I knew the explicit form of matrices in $SO(2)$.
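Concretely (just to record one such path, writing $v = (\cos\alpha, \sin\alpha)^T$ for some angle $\alpha$):
$$R(t) = \begin{pmatrix} \cos(t\alpha) & \sin(t\alpha) \\ -\sin(t\alpha) & \cos(t\alpha) \end{pmatrix}, \qquad R(0) = I, \quad R(1)v = \begin{pmatrix} \cos^2\alpha + \sin^2\alpha \\ -\sin\alpha\cos\alpha + \cos\alpha\sin\alpha \end{pmatrix} = e_1.$$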
Then I did the same thing in $\mathbb{R}^3$. That is, for a unit vector $v \in \mathbb{R}^3$ there is a path $R(t) \in SO(3)$ such that $R(0)=I$ and $R(1)v = e_1$. For this, I let $\theta$ be the angle between $v$ and $e_1$ and argued that rotating $v$ by an angle of $\theta$ about $\frac{v \times e_1}{|v \times e_1|}$ lands it on $e_1$. Using this idea, it is easy to make the required path in $SO(3)$.
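Explicitly (one way to write this down, using the Rodrigues rotation formula and assuming $v \neq \pm e_1$ so that the axis $u = \frac{v \times e_1}{|v \times e_1|}$ is defined):
$$R(t)x = x\cos(t\theta) + (u \times x)\sin(t\theta) + u\,(u \cdot x)\bigl(1 - \cos(t\theta)\bigr),$$
which is rotation by the angle $t\theta$ about $u$, so $R(0) = I$ and $R(1)v = e_1$.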
I want to do the same thing for a unit vector in $\mathbb{R}^n$ and $e_1 \in \mathbb{R}^n$. I could wave my hands a bit and argue that the group of rotations of $S^{n-1}$ acts transitively. I don't know a more explicit way to do it, since in the general case I do not know the general form of a matrix in $SO(n)$, and I also do not have the cross product on $\mathbb{R}^n$ (like I did for $n=2$ and $n=3$).
Anyway, assuming that for any $v \in \mathbb{R}^n$ with $|v|=1$ there exists a path $R(t) \in SO(n)$ with $R(0)=I$ and $R(1)v = e_1$, I'm trying to figure out what to do next.
A result from linear algebra is that if $A \in O(n)$, then there is an orthonormal basis of $\mathbb{R}^n$ in which the matrix representation of $A$ is block diagonal, with $2 \times 2$ blocks of the form $$ \begin{pmatrix} c & -s \\ s & c \\ \end{pmatrix}, \hspace{20pt} c^2 + s^2 = 1, $$ together with (possibly) a $p \times p$ identity block $I_p$ and a $q \times q$ negative identity block $-I_q$. See pages 113-114 of https://mtaylor.web.unc.edu/wp-content/uploads/sites/16915/2018/04/linalg.pdf for a proof.
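Written out (with $Q \in O(n)$ the orthogonal change-of-basis matrix, and $R_{\theta_j}$ denoting a $2 \times 2$ block as above with $c = \cos\theta_j$, $s = \sin\theta_j$), the statement is that
$$ Q^T A Q = \begin{pmatrix} R_{\theta_1} & & & & \\ & \ddots & & & \\ & & R_{\theta_k} & & \\ & & & I_p & \\ & & & & -I_q \end{pmatrix}. $$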
With this characterization of $O(n)$, it is simple to show that the matrix exponential $\text{Exp} : \text{Skew}(n) \to SO(n)$ is surjective (do each block separately; use Euler's formula). This then gives an obvious smooth path $\gamma : [0, 1] \to SO(n)$ between $I = \text{Exp}(0)$ and any given $A = \text{Exp}(B) \in SO(n)$. Details are below:
Let $\text{Exp}$ denote the matrix exponential. Note that $\text{Exp} \colon \text{Skew}(n) \to SO(n)$. I claim that $\text{Exp} : \text{Skew}(n) \to SO(n)$ is surjective. Let $A \in SO(n)$. It suffices to show that each block of $A$, as described above, is the exponential of some skew-symmetric matrix. Euler's formula, translated into $2 \times 2$ matrices (identify $i\theta$ with $\begin{pmatrix} 0 & -\theta \\ \theta & 0 \end{pmatrix}$), shows how to get the $2 \times 2$ blocks as exponentials of skew-symmetric matrices. The fact that $\det(A) = 1 > 0$ implies that $q$ is even. Since $q$ is even, the $-I_q$ block can be split into $2 \times 2$ blocks with $c = -1$, $s = 0$, so Euler's formula works here too. The $I_p$ block is just $\text{Exp}(0)$. Putting these blocks together gives a block-diagonal $B_0 \in \text{Skew}(n)$ with $\text{Exp}(B_0)$ equal to the block form of $A$; conjugating by the orthogonal change-of-basis matrix $Q$ and using $Q\,\text{Exp}(B_0)\,Q^T = \text{Exp}(Q B_0 Q^T)$ then gives $B = Q B_0 Q^T \in \text{Skew}(n)$ with $\text{Exp}(B) = A$.
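For reference, the $2 \times 2$ computation is the matrix form of Euler's formula: writing $J = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}$, we have $J^2 = -I$, so
$$ \text{Exp}(\theta J) = (\cos\theta) I + (\sin\theta) J = \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}, $$
and taking $\theta = \pi$ gives $-I_2$.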
Now set $\gamma(t) = \text{Exp}(tB)$ for $t \in [0, 1]$; this is continuous (indeed smooth) in $t$. Since $tB \in \text{Skew}(n)$, it follows that $\gamma(t) \in SO(n)$, so $\gamma : [0, 1] \to SO(n)$. Since $\gamma(0) = I$ and $\gamma(1) = \text{Exp}(B) = A$, this gives the desired path.
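For completeness, here is the one-line check that $\gamma(t)$ really lands in $SO(n)$: since $(tB)^T = -tB$ and $-tB$ commutes with $tB$,
$$ \gamma(t)^T \gamma(t) = \text{Exp}(-tB)\,\text{Exp}(tB) = I, \qquad \det \gamma(t) = e^{\,t\,\text{tr}(B)} = e^0 = 1. $$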