Symmetry of Grassmannians

I thought this might be simple (now I'm not sure) but can't solve it: why is it true that for $X,Y$ two linear $n$-subspaces of $\mathbb{R}^{n+k}$ there exists an orthogonal transformation of $\mathbb{R}^{n+k}$ that takes $X$ to $Y$ and $Y$ to $X$?

(This is exercise 5-D in Milnor's Characteristic Classes, so the statement should hold.)


First we will make a couple of reductions. Let $U=X\cap Y$ and $V=X+Y$. Then $$\dim X+ \dim Y=\dim (X+Y)+\dim (X\cap Y)=\dim V+\dim U.$$ Hence, we can write $X=\tilde{X}\oplus U$ and $Y=\tilde{Y}\oplus U$, where $\tilde{X}$ and $\tilde{Y}$ are orthogonal to $U$ (and satisfy $\tilde{X}\cap\tilde{Y}=0$). Finally, if we set $W$ to be the orthogonal complement of $X+Y,$ then we have obtained an orthogonal decomposition $$ \mathbb{R}^{n+k}=U\oplus W\oplus (\tilde{X}+\tilde{Y}). $$ So, if we can find an orthogonal transformation from $\tilde{X}+\tilde{Y}$ to itself that exchanges $\tilde{X}$ and $\tilde{Y}$, then we can simply extend it to be the identity on $U$ and $W$, and we obtain an orthogonal transformation with the required properties.
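In case a concrete check is useful, here is a minimal NumPy sketch of this reduction (the helpers `orth` and `intersection` are ad hoc names for this illustration, not from any particular library): it computes $U$, $\tilde{X}$, $\tilde{Y}$ and $W$ for two random subspaces and verifies the dimension count and the orthogonality relations.

```python
import numpy as np

rng = np.random.default_rng(0)

def orth(A, tol=1e-10):
    # orthonormal basis (columns) for the column space of A
    u, s, _ = np.linalg.svd(A, full_matrices=False)
    return u[:, s > tol]

def intersection(BX, BY, tol=1e-10):
    # basis for col(BX) ∩ col(BY) via the null space of [BX | -BY]:
    # [a; b] in the null space  <=>  BX a = BY b, which is a vector of X ∩ Y
    M = np.hstack([BX, -BY])
    _, s, vt = np.linalg.svd(M)
    rank = int((s > tol).sum())
    null = vt[rank:].T
    return orth(BX @ null[: BX.shape[1]])

# two 2-dimensional subspaces of R^5 sharing (generically) exactly one direction
shared = rng.standard_normal((5, 1))
BX = orth(np.hstack([shared, rng.standard_normal((5, 1))]))
BY = orth(np.hstack([shared, rng.standard_normal((5, 1))]))

U  = intersection(BX, BY)              # U = X ∩ Y
Xt = orth(BX - U @ (U.T @ BX))         # X~ = part of X orthogonal to U
Yt = orth(BY - U @ (U.T @ BY))         # Y~ = part of Y orthogonal to U
V  = orth(np.hstack([BX, BY]))         # V = X + Y
W  = orth(np.eye(5) - V @ V.T)         # W = orthogonal complement of X + Y

# dim X + dim Y = dim V + dim U, and U, W, X~, Y~ are mutually orthogonal
assert BX.shape[1] + BY.shape[1] == V.shape[1] + U.shape[1]
for A, B in [(U, W), (U, Xt), (U, Yt), (W, Xt), (W, Yt)]:
    assert np.allclose(A.T @ B, 0)
```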

Now assume that $X\cap Y=0$, $X\oplus Y=\mathbb{R}^{2n}$ and $\dim X=\dim Y=n.$ Let $\{x_{1},\ldots,x_{n}\}$ and $\{y_{1},\ldots,y_{n}\}$ be orthonormal bases of $X$ and $Y$, respectively, and let $$ A=(\langle x_{i},y_{j}\rangle), \qquad 1\leq i,j \leq n, $$ be the matrix of the inner products of the $x_{i}$'s with the $y_{j}$'s. Using the polar decomposition, we can write $A=KP$, where $K$ is orthogonal and $P$ is symmetric (positive semi-definite). Hence, setting $x_{i}'=\sum_{k}K_{ki}x_{k}$, we obtain a new orthonormal basis $\{x_{1}',\ldots,x_{n}'\}$ of $X$ such that $$ P=(\langle x_{i}',y_{j}\rangle). $$

We now define $T:\mathbb{R}^{2n}\longrightarrow \mathbb{R}^{2n}$ by setting $$ Tx_{i}'=y_{i} \qquad \mbox{and} \qquad Ty_{i}=x_{i}'. $$ (This determines $T$: since $X\cap Y=0$ and $\dim X+\dim Y=2n$, the vectors $x_{1}',\ldots,x_{n}',y_{1},\ldots,y_{n}$ form a basis of $\mathbb{R}^{2n}$.)

We claim that $T$ is an orthogonal transformation. Indeed, since $\{x_{1}',\ldots,x_{n}'\}$ and $\{y_{1},\ldots,y_{n}\}$ are orthonormal, it follows that $\langle Tx_{i}',Tx_{j}' \rangle=\langle y_{i},y_{j} \rangle=\delta_{ij}=\langle x_{i}',x_{j}' \rangle=\langle Ty_{i},Ty_{j} \rangle.$ Finally, $$ \langle Tx_{i}',Ty_{j}\rangle=\langle y_{i},x_{j}'\rangle=\langle x_{j}',y_{i}\rangle=\langle x_{i}',y_{j}\rangle, $$ where the last equality follows from the fact that $P$ is a symmetric matrix. Therefore, $T$ preserves the inner products of all pairs of basis vectors, hence is orthogonal, and it satisfies all the required properties.
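If one wants to see this construction numerically, here is a minimal NumPy sketch (the polar factor is obtained from an SVD, which is just one way to compute it): it builds $T$ from the two bases and checks that $T$ is orthogonal and exchanges $X$ and $Y$.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3

# random orthonormal bases (columns) of X and Y in R^{2n}; generically X ∩ Y = 0
BX, _ = np.linalg.qr(rng.standard_normal((2 * n, n)))
BY, _ = np.linalg.qr(rng.standard_normal((2 * n, n)))

# A_{ij} = <x_i, y_j> and its polar decomposition A = K P, computed via the SVD
A = BX.T @ BY
u, s, vt = np.linalg.svd(A)
K = u @ vt                         # orthogonal factor
P = vt.T @ np.diag(s) @ vt         # symmetric (positive semi-definite) factor
assert np.allclose(K @ P, A) and np.allclose(P, P.T)

# new orthonormal basis x_i' = sum_k K_{ki} x_k of X, so that <x_i', y_j> = P_{ij}
BXp = BX @ K
assert np.allclose(BXp.T @ BY, P)

# T swaps the two halves of the basis {x_1', ..., x_n', y_1, ..., y_n} of R^{2n}
B = np.hstack([BXp, BY])           # invertible because X ∩ Y = 0
T = np.hstack([BY, BXp]) @ np.linalg.inv(B)

assert np.allclose(T @ T.T, np.eye(2 * n))   # T is orthogonal
assert np.allclose(T @ BXp, BY)              # T(X) = Y
assert np.allclose(T @ BY, BXp)              # T(Y) = X
```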


Wow, this was harder than it looked!

I will show that there exist bases $\{x_1, \ldots, x_n\}$ of $X$ and $\{y_1, \ldots, y_n\}$ of $Y$ such that the subspaces $V_i := \mathrm{span}(x_i,y_i)$ form an orthogonal decomposition of $X+Y$. This is enough because then we can take an orthogonal transformation $f_i : V_i \to V_i$ such that $f_i(x_i) = y_i$ and $f_i(y_i) = x_i$, add them up to get an orthogonal transformation $X+Y \to X+Y$ that swaps $X$ and $Y$, and finally extend that to an orthogonal transformation of all of $\mathbb{R}^{n+k}$ arbitrarily.

Note: This orthogonal decomposition business amounts to requiring that for $i \neq j $, we have $x_i \perp x_j$, $x_i \perp y_j$, and $y_i \perp y_j$. In particular, we do not claim that $x_i$ is orthogonal to $y_i$, or indeed, even that $x_i \neq y_i$. When $X \cap Y \neq 0$, we could very well take $x_1 = y_1$, $x_2 = y_2$, ..., $x_r = y_r$ to be a basis of $X \cap Y$, for example.

OK, we'll prove the existence of those bases by induction on $n$. For $n=0$, everything is trivial. I claim the induction step boils down to proving that there exist non-zero vectors $x_1 \in X$, $y_1 \in Y$ such that $X \cap x_1^\perp \subseteq y_1^\perp$ and $Y \cap y_1^\perp \subseteq x_1^\perp$ (where $v^\perp$ denotes the orthogonal complement of the span of the vector $v$).

Indeed, if we can find such $x_1$ and $y_1$, setting $V_1 = \mathrm{span}(x_1,y_1)$, we will have $X \cap x_1^\perp = X \cap x_1^\perp \cap y_1^\perp = X \cap V_1^\perp$ and an analogous statement for $Y$. We can therefore apply the inductive hypothesis to the $(n-1)$-dimensional subspaces $X \cap x_1^\perp$ and $Y \cap y_1^\perp$ inside the space $V_1^\perp$.

Here's how you can pick $x_1$ and $y_1$: choose unit vectors $x_1 \in X$ and $y_1 \in Y$ maximizing $|x_1 \cdot y_1|$ (the maximum is attained because the unit spheres in $X$ and $Y$ are compact). (Notice that if $X \cap Y \neq 0$, this maximum is $1$, attained for $x_1 = \pm y_1 \in X \cap Y$.)

To prove that $X \cap x_1^\perp \subseteq y_1^\perp$, let $x$ be a unit vector in $X$ such that $x \cdot x_1 = 0$. We know that for all $t \in \mathbb{R}$, $$ \frac{|(x_1 + tx) \cdot y_1|}{|x_1 + tx|} \le |x_1 \cdot y_1|. $$

Using that $|x_1 + tx| = \sqrt{1+t^2}$, squaring and rearranging, we get that for all real $t$, $\left( (x_1 \cdot y_1)^2 - (x \cdot y_1)^2 \right)t^2 - 2(x_1 \cdot y_1) (x \cdot y_1) t \ge 0$. This forces $x \cdot y_1 = 0$: if $x_1 \cdot y_1 \neq 0$, a non-zero linear term would make the left-hand side negative for small $t$ of a suitable sign; and if $x_1 \cdot y_1 = 0$, the inequality reads $-(x \cdot y_1)^2 t^2 \ge 0$, which again forces $x \cdot y_1 = 0$.

(Alternatively, with $x_1, y_1, x$ as above, reason geometrically as follows: $x_1$ is the unit vector in the 2-plane $X' := \mathrm{span}(x_1,x)$ that makes the smallest angle with $y_1$. It is a simple fact of 3-dimensional Euclidean geometry that the plane $\mathrm{span}(x_1,y_1)$ then meets $X'$ at a right dihedral angle; this implies that, since $x \perp x_1$, also $x \perp y_1$.)
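A numerical footnote: the greedy step above (maximize $|x_1 \cdot y_1|$, pass to $V_1^\perp$, repeat) essentially computes the principal vectors of the pair $(X, Y)$, which the SVD of the matrix of inner products gives all at once. Here is a minimal NumPy sketch, just to check the claimed orthogonal decomposition.

```python
import numpy as np

rng = np.random.default_rng(2)
n, k = 3, 4

# random orthonormal bases (columns) of two n-dimensional subspaces of R^{n+k}
BX, _ = np.linalg.qr(rng.standard_normal((n + k, n)))
BY, _ = np.linalg.qr(rng.standard_normal((n + k, n)))

# SVD of the inner-product matrix; the leading singular pair maximizes |x · y|
u, s, vt = np.linalg.svd(BX.T @ BY)
xs = BX @ u          # columns x_1, ..., x_n: an orthonormal basis of X
ys = BY @ vt.T       # columns y_1, ..., y_n: an orthonormal basis of Y

# the planes V_i = span(x_i, y_i) are mutually orthogonal: for i != j we get
# x_i ⟂ x_j, y_i ⟂ y_j and x_i ⟂ y_j; only x_i · y_i = s[i] can be non-zero
assert np.allclose(xs.T @ xs, np.eye(n))
assert np.allclose(ys.T @ ys, np.eye(n))
assert np.allclose(xs.T @ ys, np.diag(s))
```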