Rows of orthogonal matrices are orthogonal?

How do I prove that rows of orthogonal matrices are also orthogonal?

By definition, an orthogonal matrix is one whose inverse equals its transpose, but I don't see where the row orthogonality comes from. Also, if $A$ has orthogonal rows, does it follow that $A$ is itself orthogonal?


Solution 1:

Let the rows of $\mathbf A$ be $\mathbf a_1, \ldots, \mathbf a_n$. Then the columns of $\mathbf A^T$ are $\mathbf a_1^T, \ldots, \mathbf a_n^T$. By definition the product $\mathbf P = \mathbf A \mathbf A^T = \mathbf A \mathbf A^{-1}$ is the unit matrix $\mathbf I = (\delta_{ij})$. But the entry $p_{ij}= \delta_{ij} $ of $\mathbf P$ is given as the matrix product $\mathbf a_i \mathbf a_j^T$, the latter being the same as the scalar product $\langle \mathbf a_i, \mathbf a_j \rangle$. This gives the desired orthogonality relations for the row vectors.
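The argument above can be checked numerically. Here is a quick sanity check in NumPy (the 2D rotation matrix is just an illustrative choice of orthogonal matrix):

```python
import numpy as np

# A concrete orthogonal matrix: a 2D rotation by 30 degrees.
theta = np.pi / 6
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# P = A A^T should be the identity, so its (i, j) entry equals
# the dot product <a_i, a_j> of rows i and j of A.
P = A @ A.T
print(np.allclose(P, np.eye(2)))   # rows are orthonormal
print(abs(A[0] @ A[1]) < 1e-12)    # distinct rows are orthogonal
```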

The converse is also true. If the rows of $\mathbf A$ are orthonormal, then the above considerations show that $\mathbf A \mathbf A^T = \mathbf I$. Hence $$\mathbf A^{-1} = \mathbf A^{-1} \mathbf I = \mathbf A^{-1} \mathbf A \mathbf A^T = \mathbf A^T .$$
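The converse can be illustrated the same way: starting from a matrix with orthonormal rows (a signed permutation matrix, chosen here purely for illustration), the computed inverse agrees with the transpose.

```python
import numpy as np

# A matrix whose rows are orthonormal (a signed permutation matrix,
# chosen only for illustration).
A = np.array([[0.0, 1.0,  0.0],
              [0.0, 0.0, -1.0],
              [1.0, 0.0,  0.0]])

# Orthonormal rows => A A^T = I, and hence A^{-1} = A^T.
print(np.allclose(A @ A.T, np.eye(3)))
print(np.allclose(np.linalg.inv(A), A.T))
```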

Solution 2:

I guess you are asking to prove the following: Suppose $A$ is an $n \times n$ matrix with columns that are orthogonal and have unit norm. Then the rows of $A$ are also orthogonal and have unit norm.

To prove this, suppose the columns of $A$ are $a_1, \dots, a_n$. First note that $a_1, \dots, a_n$ are linearly independent: if $$\alpha_1 a_1 + \dots + \alpha_n a_n = 0$$ for some constants $\alpha_1, \dots, \alpha_n$, then taking the dot product of both sides with $a_i$ gives $\alpha_i = 0$.

Since $n$ linearly independent vectors span $\mathbb{R}^n$, we can write any vector $x \in \mathbb{R}^n$ as a linear combination of $a_1, \dots, a_n$: $$x = \beta_1 a_1 + \dots + \beta_n a_n.$$ Taking the dot product of both sides with $a_i$ gives $\beta_i = \left<x, a_i\right>$. This yields the representation $$x = \sum_{i=1}^n \left<x, a_i \right> a_i = \sum_{i=1}^n (a_i^T x) a_i = \left(\sum_{i=1}^n a_i a_i^T \right) x.$$ Because this holds for every $x$, we conclude that $$\sum_{i=1}^n a_i a_i^T = I,$$ where $I$ is the identity matrix. This is the same as $AA^T = I$.

It is now easy to see that the $(i, j)^{th}$ entry of $AA^T$ is precisely the dot product of the $i^{th}$ and $j^{th}$ rows of $A$. Thus $AA^T = I$ implies that the rows of $A$ are orthogonal and have unit norm.
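The key identity $\sum_{i=1}^n a_i a_i^T = I$ can be verified numerically. The sketch below builds a matrix with orthonormal columns via a QR decomposition of a random matrix (the seed and size are arbitrary choices) and checks that the sum of the column outer products is the identity:

```python
import numpy as np

rng = np.random.default_rng(0)
# Q has orthonormal columns (QR decomposition of a random 4x4 matrix).
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))

# Sum of outer products a_i a_i^T over the columns a_i of Q.
S = sum(np.outer(Q[:, i], Q[:, i]) for i in range(4))

print(np.allclose(S, np.eye(4)))        # sum_i a_i a_i^T = I
print(np.allclose(Q @ Q.T, np.eye(4)))  # the same statement as Q Q^T = I
```

Note that `Q @ Q.T` involves the *rows* of `Q`, so the check confirms that orthonormal columns force orthonormal rows, exactly as the proof concludes.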