A direct computation is also fine: $$(PP^T)_{ij} = \sum_{k=1}^n P_{ik} P^T_{kj} = \sum_{k=1}^n P_{ik} P_{jk},$$ but $P_{ik}$ is usually 0, and so $P_{ik} P_{jk}$ is usually 0. The only time $P_{ik}$ is nonzero is when it equals 1, and in that case no other row $i' \neq i$ has a nonzero entry in column $k$ ($i$ is the only row with a 1 in column $k$). When $i = j$, the single 1 in row $i$ contributes $1 \cdot 1 = 1$ to the sum. In other words, $$\sum_{k=1}^n P_{ik} P_{jk} = \begin{cases} 1 & \text{if } i = j \\ 0 & \text{otherwise} \end{cases}$$ and this is exactly the formula for the entries of the identity matrix, so $$PP^T = I.$$
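As a quick numerical sanity check of the entrywise formula, here is a sketch in NumPy with an arbitrary $4\times 4$ example (the particular matrix is my own choice, not from the argument above):

```python
import numpy as np

# An arbitrary 4x4 permutation matrix: exactly one 1 in each row and column.
P = np.array([[0, 1, 0, 0],
              [0, 0, 0, 1],
              [1, 0, 0, 0],
              [0, 0, 1, 0]])

n = P.shape[0]
for i in range(n):
    for j in range(n):
        # (P P^T)_{ij} = sum_k P_{ik} P_{jk}, which is 1 iff i == j.
        s = sum(P[i, k] * P[j, k] for k in range(n))
        assert s == (1 if i == j else 0)

# The entrywise result assembles into the matrix identity P P^T = I.
assert np.array_equal(P @ P.T, np.eye(n, dtype=int))
```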


Another way to prove it is to observe that any permutation matrix is a product of elementary permutation matrices, where by elementary I mean a permutation that swaps two entries. Since in an identity matrix swapping rows $i$ and $j$ gives the same result as swapping columns $i$ and $j$, such a matrix is symmetric, and it coincides with its own inverse (performing the same swap twice restores the original order). Then, writing $P=P_1\cdots P_k$ with $P_1,\ldots,P_k$ elementary, we have

$$ P^{-1} = (P_1\cdots P_k)^{-1}=P_k^{-1}\cdots P_1^{-1}=P_k\cdots P_1=P_k^t\cdots P_1^t = (P_1\cdots P_k)^t=P^t $$
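The chain of equalities can be illustrated numerically. A minimal sketch in NumPy, where the particular swaps chosen are arbitrary examples: each elementary swap matrix is symmetric and self-inverse, and transposing a product reverses the factors.

```python
import numpy as np

def swap_matrix(n, i, j):
    """Elementary permutation: the identity matrix with rows i and j swapped."""
    S = np.eye(n, dtype=int)
    S[[i, j]] = S[[j, i]]
    return S

n = 4
S1 = swap_matrix(n, 0, 2)
S2 = swap_matrix(n, 1, 3)

# Each elementary swap is symmetric and equals its own inverse.
assert np.array_equal(S1, S1.T)
assert np.array_equal(S1 @ S1, np.eye(n, dtype=int))

# For P = S1 S2, reversing the factors gives the transpose: S2 S1 = (S1 S2)^T,
# and that reversed product is the inverse of P.
P = S1 @ S2
assert np.array_equal(S2 @ S1, P.T)
assert np.array_equal(P @ P.T, np.eye(n, dtype=int))
```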


Using a little knowledge about orthogonal matrices, the following proof is pretty simple:

Since $v^tw=\sum_{i=1}^n v_iw_i$ for $v=(v_1,\ldots,v_n)$ and $w=(w_1,\ldots,w_n)$, we have $v^tv=1$ whenever $v$ is a column of $P$ (each column has a single entry equal to 1). On the other hand, $v^tw=0$ if $v$ and $w$ are two distinct columns of $P$, since their 1s sit in different positions. Therefore we can conclude that $(P^tP)_{i,j}=\delta_{i,j}$, and so $P^t=P^{-1}$.
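The orthonormality of the columns can be checked directly; a short sketch in NumPy with an arbitrary $3\times 3$ example:

```python
import numpy as np

# An arbitrary 3x3 permutation matrix.
P = np.array([[0, 0, 1],
              [1, 0, 0],
              [0, 1, 0]])

cols = [P[:, j] for j in range(3)]

# v^t v = 1 for each column v, and v^t w = 0 for distinct columns v, w.
for a in range(3):
    for b in range(3):
        assert cols[a] @ cols[b] == (1 if a == b else 0)

# Hence (P^t P)_{ij} = delta_{ij}, i.e. P^t = P^{-1}.
assert np.array_equal(P.T @ P, np.eye(3, dtype=int))
```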


Less sophisticated, you could just crunch it out.

First, a lemma:

The inverse of a matrix, if it exists, is unique.

Proof: If both $B$ and $C$ are inverse to $A$, then we have $B = BI = B(AC) = (BA)C = IC = C$ so $B = C$. (Here, $I$ denotes the identity matrix).

Using this, it follows in our specific case that in order to show $A^T = A^{-1}$, we need only show $A^TA = AA^T = I$.

Assume $i\neq j$. Then $(AA^T)_{ij} = \sum_k A_{ik}A^T_{kj} = \sum_k A_{ik}A_{jk}$. But for each $k$, $A_{ik}A_{jk} = 0$: both factors lie in the $k$th column, which contains only one nonzero entry, and since $i\neq j$, $A_{ik}$ and $A_{jk}$ can't both be that nonzero entry. So, $(AA^T)_{ij} = 0$ when $i\neq j$.

The argument that $(A^TA)_{ij} = 0$ when $i\neq j$ is almost identical, but uses the fact that each row of $A$ contains only one nonzero entry.

Can you see what happens when, instead, $i = j$?
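The crunching can be done by machine as well. A small NumPy sketch checking every entry, with an arbitrary $3\times 3$ example (it also answers the question above: on the diagonal, the single nonzero entry of row $i$ contributes $1 \cdot 1 = 1$):

```python
import numpy as np

# An arbitrary 3x3 permutation matrix.
A = np.array([[0, 1, 0],
              [0, 0, 1],
              [1, 0, 0]])
n = A.shape[0]

for i in range(n):
    for j in range(n):
        entry = sum(A[i, k] * A[j, k] for k in range(n))
        # For i != j every product A_{ik} A_{jk} vanishes; for i == j the
        # single nonzero entry of row i contributes 1 * 1 = 1.
        assert entry == (1 if i == j else 0)
```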


Let $\pi$ be a permutation on $n$ objects, written in two-line notation as

\begin{equation} \pi=\left(\begin{matrix} 1 & 2 &\ldots& n \\ \pi(1) & \pi(2) &\ldots& \pi(n) \end{matrix} \right) \end{equation}

Let $P_\pi$ be the corresponding permutation matrix. We need to prove that $P_\pi^T P_\pi=I$.

Note that $\pi$ sends the $i$th row of the identity matrix to the $\pi(i)$th row, i.e.,

\begin{eqnarray*} P_\pi=[P_{ij}]=\left\{ \begin{array}{ll} 1; & i=\pi(j)\\ 0; & i \ne \pi(j). \end{array} \right. \end{eqnarray*}

The $ij$th component of $P_\pi^TP_\pi$ is

\begin{eqnarray} (P_\pi^TP_\pi)_{ij}&=&\sum_{k=1}^n P^T_{ik}P_{kj}\\ &=&\sum_{k=1}^n P_{ki}P_{kj}\\ &=& P_{\pi(j)i}P_{\pi(j)j}\\ &=& P_{\pi(j)i}=\left\{ \begin{array}{ll} 1; & i=j\\ 0; & i \ne j. \end{array} \right. \end{eqnarray}

Therefore, $P^T_\pi P_\pi=I$.
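The convention $P_{ij} = 1$ iff $i = \pi(j)$ translates directly into code. A minimal sketch in NumPy, 0-indexed and with an arbitrary example permutation:

```python
import numpy as np

def perm_matrix(pi):
    """Build P with P[i, j] = 1 exactly when i == pi[j] (0-indexed)."""
    n = len(pi)
    P = np.zeros((n, n), dtype=int)
    for j in range(n):
        P[pi[j], j] = 1
    return P

pi = [2, 0, 3, 1]  # an arbitrary permutation of {0, 1, 2, 3}
P = perm_matrix(pi)

# (P^T P)_{ij} collapses to P_{pi(j), i}, which is 1 exactly when i == j.
assert np.array_equal(P.T @ P, np.eye(len(pi), dtype=int))
```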