LU decomposition; do permutation matrices commute?
Solution 1:
No, in general permutation matrices do not commute.
It seems like you are using the Doolittle algorithm, so I will assume that you are.
Let $U_i$ be the $i$-th step in your LU decomposition of $A$, so $$\begin{align} U_0 &= A \\ U_1 &= L_1P_1U_0 \\ \vdots \\ U_n &= L_nP_nU_{n-1} \end{align}$$
This corresponds well to how one would do an LU decomposition by hand: bring the largest element of the current column into the pivot position on the row you are at (i.e. multiply by $P_k$), then eliminate that column in the rows below (i.e. multiply by $L_k$).
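As a sketch of this by-hand procedure in NumPy (the helper name `lu_steps` is illustrative, not from the original; the loop is 0-indexed where the text is 1-indexed), the per-column step $U_k = L_k P_k U_{k-1}$ might look like:

```python
import numpy as np

def lu_steps(A):
    """Stepwise elimination U_k = L_k @ P_k @ U_{k-1} with partial
    pivoting.  Returns the atomic factors L_k, the permutations P_k,
    and the final upper triangular U.  A minimal sketch, not
    production code."""
    U = np.array(A, dtype=float)
    n = U.shape[0]
    Ls, Ps = [], []
    for k in range(n - 1):
        # P_k: swap row k with the row holding the largest (in absolute
        # value) entry of column k at or below the diagonal.
        P = np.eye(n)
        p = k + np.argmax(np.abs(U[k:, k]))
        P[[k, p]] = P[[p, k]]
        U = P @ U
        # L_k: atomic lower triangular, eliminates column k below row k.
        L = np.eye(n)
        L[k+1:, k] = -U[k+1:, k] / U[k, k]
        U = L @ U
        Ls.append(L)
        Ps.append(P)
    return Ls, Ps, U

A = np.array([[2., 1, 1], [4, 3, 3], [8, 7, 9]])
Ls, Ps, U = lu_steps(A)      # here L_2 P_2 L_1 P_1 A = U by construction
```

Multiplying the returned factors back together reproduces $U$, which is exactly the chain of equations above.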
As you remark, the $L_i$ will be atomic lower triangular matrices, the non-zero elements all being in column $i$. The inverse of $L_i$ can be constructed by negating all off-diagonal elements, see Wikipedia.
The permutation matrix $P_j$ will be a permutation matrix switching row $j$ with a row $k \geq j$, if multiplied on the left of the matrix you want to transform. The inverse to $P_j$ is $P_j$ itself (since $P_j$ switches row $j$ with row $k$, you can undo this by doing the same thing).
Consider the product $P_jL_i$ for $i < j$. $P_j$ switches two rows of $L_i$: row $j$ and a row $k \geq j > i$. This swaps the entries $$\begin{align} L_{j,i} &\leftrightarrow L_{k,i} \\ L_{j,j} &\leftrightarrow L_{k,j} \end{align}$$ Let $L_i'$ be the matrix produced by switching just the off-diagonal elements (the first switch above). Note that this is still an atomic lower triangular matrix. We can then produce $P_jL_i$ by switching column $j$ with column $k$ in $L_i'$, which is multiplying by $P_j$ on the right; the column swap moves the diagonal ones into the positions the row swap would have put them. Here it is important that $i < j, k$, so column $i$ of $L_i'$ is not changed! In other words: $$P_j L_i = L_i' P_j$$
Thus, from your equation $$L_nP_nL_{n-1}\cdots P_3L_2P_2L_1P_1A = U$$ you can produce $$L_n^S L_{n-1}^S \cdots L_1^S P_n P_{n-1} \cdots P_1A = U$$ which can be transformed (note that each $L_i^S$ is still atomic lower triangular) into $$PA = LU$$ by taking $$P = P_nP_{n-1} \cdots P_1$$ and $$L = (L_n^S L_{n-1}^S \cdots L_1^S)^{-1}.$$
Here, $L_i^S$ is the matrix obtained from $L_i$ by applying every $P_j$ (on the left) with $j > i$ to its off-diagonal elements, i.e. by permuting the subdiagonal entries of column $i$.
You do this as follows. Starting with $$L_nP_nL_{n-1}\cdots P_3L_2P_2L_1P_1A = U$$ move the $P_2$ matrix to the right using $P_2L_1 = L_1' P_2$, producing $$L_nP_nL_{n-1}\cdots P_3L_2L_1'P_2P_1A = U$$ Then do the same for $P_3$, which has to be moved to the right twice, using $P_3L_2 = L_2'P_3$ and $P_3L_1' = L_1''P_3$: $$L_nP_nL_{n-1}\cdots L_2'L_1''P_3P_2P_1A = U$$ and so on for every $P_j$.
As an example of $P_j L_i = L_i' P_j$ consider $$P = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 0 & 1 \\ 0 & 1 & 0 \end{pmatrix}$$ $$L = \begin{pmatrix} 1 & 0 & 0 \\ 2 & 1 & 0 \\ 3 & 0 & 1 \end{pmatrix}$$ $$L' = \begin{pmatrix} 1 & 0 & 0 \\ 3 & 1 & 0 \\ 2 & 0 & 1 \end{pmatrix}$$
$$PL = \begin{pmatrix} 1 & 0 & 0 \\ 3 & 0 & 1 \\ 2 & 1 & 0 \end{pmatrix}$$ $$L'P = \begin{pmatrix}1 & 0 & 0 \\ 3 & 0 & 1 \\ 2 & 1 & 0 \end{pmatrix}$$
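This example is easy to check numerically; here is a quick NumPy sketch of it:

```python
import numpy as np

# The 3x3 example above: P swaps rows 2 and 3, L is atomic in column 1.
P = np.array([[1., 0, 0], [0, 0, 1], [0, 1, 0]])
L = np.array([[1., 0, 0], [2, 1, 0], [3, 0, 1]])

# L' swaps only the off-diagonal multipliers of L (the strictly lower
# part), leaving the diagonal ones in place.
Lp = np.eye(3) + P @ (L - np.eye(3))

assert np.allclose(P @ L, Lp @ P)   # P L = L' P
```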
So, to summarize: these special matrices almost commute; only small changes are needed to swap them past each other, and all the important properties of the matrices are preserved when doing so.
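As a sanity check on the whole procedure, here is a minimal NumPy sketch (the helper name `assemble_plu` and the small per-step factors are illustrative, not from the original): it pushes each $P_k$ to the right using $P L = (I + P(L - I))\,P$ and then inverts the accumulated product.

```python
import numpy as np

def assemble_plu(A, Ls, Ps):
    """Given per-step atomic factors L_k and transpositions P_k with
    L_n P_n ... L_1 P_1 A = U, push each P_k to the right using
    P L = (I + P (L - I)) P and return P, L, U with P A = L U."""
    n = A.shape[0]
    Lam = np.eye(n)                 # running product L_k^S ... L_1^S
    P = np.eye(n)
    for Lk, Pk in zip(Ls, Ps):
        Lam = Lk @ (np.eye(n) + Pk @ (Lam - np.eye(n)))
        P = Pk @ P
    L = np.linalg.inv(Lam)          # L = (L_n^S ... L_1^S)^{-1}
    U = Lam @ P @ A
    return P, L, U

# hypothetical per-step factors for a small example:
A  = np.array([[2., 1, 1], [4, 3, 3], [8, 7, 9]])
P1 = np.eye(3)[[2, 1, 0]]           # swap rows 1 and 3
L1 = np.array([[1., 0, 0], [-0.5, 1, 0], [-0.25, 0, 1]])
P2 = np.eye(3)[[0, 2, 1]]           # swap rows 2 and 3
L2 = np.array([[1., 0, 0], [0, 1, 0], [0, -2/3, 1]])

P, L, U = assemble_plu(A, [L1, L2], [P1, P2])
assert np.allclose(P @ A, L @ U)
```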
Solution 2:
I found an answer based on matrix algebra rather than components here. Here's my attempt to reproduce it with some more detail.
Starting from $$L_3 P_3 L_2 P_2 L_1 P_1 A = U$$ Define $$\Lambda_1 = P_3 P_2 L_1 P_2^{-1} P_3^{-1}$$ $$\Lambda_2 = P_3 L_2 P_3^{-1}$$ $$\Lambda_3 = L_3$$
Calculating $$\Lambda_3 \Lambda_2 \Lambda_1 P_3 P_2 P_1 = (L_3)(P_3 L_2 P_3^{-1})(P_3 P_2 L_1 P_2^{-1} P_3^{-1}) P_3 P_2 P_1 = L_3 P_3 L_2 P_2 L_1 P_1$$ We can swap this into our starting formula to get $$\Lambda_3 \Lambda_2 \Lambda_1 P_3 P_2 P_1 A = U$$ And letting $\Lambda = \Lambda_1^{-1} \Lambda_2^{-1} \Lambda_3^{-1}$ and $P = P_3 P_2 P_1$ we have $P A = \Lambda U$.
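The telescoping above can be verified numerically; the 4×4 factors below are random stand-ins with the right structure (0-indexed rows, unlike the 1-indexed text), not anything from the original question:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

def swap(i, k):
    """Permutation matrix exchanging rows i and k."""
    P = np.eye(n)
    P[[i, k]] = P[[k, i]]
    return P

def atomic(i):
    """Unit lower triangular, nonzeros only below the diagonal in column i."""
    L = np.eye(n)
    L[i+1:, i] = rng.standard_normal(n - 1 - i)
    return L

# stand-ins: L_i is atomic in column i, P_i swaps row i with a row >= i.
L1, L2, L3 = atomic(0), atomic(1), atomic(2)
P1, P2, P3 = swap(0, 2), swap(1, 3), swap(2, 3)

Lam1 = P3 @ P2 @ L1 @ P2.T @ P3.T   # P^{-1} = P^T for permutation matrices
Lam2 = P3 @ L2 @ P3.T
Lam3 = L3

# the telescoping product reproduces the original chain of factors:
assert np.allclose(Lam3 @ Lam2 @ Lam1 @ P3 @ P2 @ P1,
                   L3 @ P3 @ L2 @ P2 @ L1 @ P1)
# and each Lambda_i is still unit lower triangular:
for Lam in (Lam1, Lam2, Lam3):
    assert np.allclose(np.triu(Lam, 1), 0) and np.allclose(np.diag(Lam), 1)
```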
Now it remains to show that the $\Lambda_i$ are lower triangular and to answer your question about commuting head on. Notice that the $P_i$ only affect rows $i$ and below because the elimination algorithm has completed its work with the rows above $i$. The inverses of the $P_i$ will therefore also only need to affect those rows to unscramble things. Examining $\Lambda_1 = P_3 P_2 L_1 P_2^{-1} P_3^{-1}$ we see that $\Lambda_1$ swaps around rows 2 and below, applies $L_1$ to add multiples of the first row to those lower rows, then unswaps the rows. This is equivalent to adding appropriate multiples of $L_1$ to the original, unswapped rows, which we know is an operation described by a lower triangular matrix. Therefore the matrix $\Lambda_1$ which represents the aggregate operation $P_3 P_2 L_1 P_2^{-1} P_3^{-1}$ must be lower triangular.
In summary, the matrices don't commute as you had asked in your original question, but we can change an $L_i$ into a related matrix $\Lambda_i$ by left- and right-multiplying by some permutation matrices. Because exact commuting fails, our formula for $\Lambda$ isn't particularly pretty: $$\Lambda = P_3 P_2 L_1^{-1} P_2^{-1} L_2^{-1} P_3^{-1} L_3^{-1}$$ However, we can get a neater conclusion if we return to $$L_3 P_3 L_2 P_2 L_1 P_1 A = U$$ and let $$M = L_3 P_3 L_2 P_2 L_1 P_1$$ so that $M A = U$. Then because $P A = \Lambda U$ we must have $P = \Lambda M$ or $\Lambda = P M^{-1}$. So we can calculate $\Lambda$ from our original $M$ just by applying the composition of our permutations to $M^{-1}$, which isn't quite as good as commuting but isn't half bad either.
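The shortcut $\Lambda = P M^{-1}$ is also easy to confirm with random stand-in factors (0-indexed rows; a sketch, not the original computation):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4

def swap(i, k):
    """Permutation matrix exchanging rows i and k."""
    P = np.eye(n)
    P[[i, k]] = P[[k, i]]
    return P

def atomic(i):
    """Unit lower triangular, nonzeros only below the diagonal in column i."""
    L = np.eye(n)
    L[i+1:, i] = rng.standard_normal(n - 1 - i)
    return L

L1, L2, L3 = atomic(0), atomic(1), atomic(2)
P1, P2, P3 = swap(0, 3), swap(1, 2), swap(2, 3)

M = L3 @ P3 @ L2 @ P2 @ L1 @ P1             # M A = U
P = P3 @ P2 @ P1
Lam1 = P3 @ P2 @ L1 @ P2.T @ P3.T
Lam2 = P3 @ L2 @ P3.T
Lam = np.linalg.inv(Lam1) @ np.linalg.inv(Lam2) @ np.linalg.inv(L3)

assert np.allclose(Lam, P @ np.linalg.inv(M))   # Lambda = P M^{-1}
assert np.allclose(np.triu(Lam, 1), 0)          # and Lambda is lower triangular
```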
Solution 3:
$\require{color}$
I found @Calle's explanation interesting but had a hard time following the arguments. @kuzooroo's explanation fits my way of thinking better, but the arguments are too dense. I decided to start a new answer since my arguments will not fit in a comment.
Following @Calle: note that, since each $P_i$ is a single transposition (one row interchange), $P_i^{-1}=P_i$: doing the interchange twice brings the matrix back to its original form. We now prove that $\Lambda$ is a lower triangular matrix with ones on the diagonal.
$\Lambda_3$ is, of course, lower triangular with ones on the diagonal. To prove that $\Lambda_2$ is lower triangular with ones on the diagonal we use the fact that $L_2$ is atomic. That is,
\begin{eqnarray*} L_2 = \begin{pmatrix} 1 & 0 & \cdots & 0 & \cdots & \cdots &\cdots & \cdots & 0 \\ 0 & 1 & \cdots & 0 & \cdots & \cdots &\cdots & \cdots & 0 \\ \vdots & l_{32} & \ddots & \vdots & \vdots &\vdots & \vdots & \vdots & \vdots\\ \vdots & \vdots & \cdots & 1 & 0 & \cdots &\cdots& \cdots & \vdots \\ \vdots & l_{i2} & \cdots & 0 & 1 & 0 &\cdots & \cdots & \vdots \\ \vdots & \vdots & \cdots & \vdots & \ddots & \ddots & \ddots & \cdots & \vdots \\ \vdots & l_{k2} & \vdots & \vdots & \vdots & 0 & 1 & 0 & \vdots \\ \vdots & \vdots & \vdots & \vdots & \vdots & \vdots &\ddots & \vdots & 0 \\ 0 & 0 & \cdots & 0 & \cdots & \cdots &\cdots & 0 & 1 \end{pmatrix} \nonumber. \end{eqnarray*}
Let us assume that $P_3$ interchanges rows $i$ and $k$ (note that both $i$ and $k$ are larger than $2$). We have that
\begin{eqnarray*} P_3 L_2 = \begin{pmatrix} 1 & 0 & \cdots & 0 & \cdots & \cdots &\cdots & \cdots & 0 \\ 0 & 1 & \cdots & 0 & \cdots & \cdots &\cdots & \cdots & 0 \\ \vdots & l_{32} & \ddots & \vdots & \vdots &\vdots & \vdots & \vdots & \vdots\\ \vdots & \vdots & \cdots & 1 & 0 & \cdots &\cdots& \cdots & \vdots \\ \vdots & \boxed{l_{k2}} & \vdots & \vdots & \textcolor{green}{0} & 0 & \textcolor{blue}{1} & 0 & \vdots \\ \vdots & \vdots & \cdots & \vdots & \ddots & \ddots & \ddots & \cdots & \vdots \\ \vdots & \boxed{l_{i2}} & \cdots & 0 & \textcolor{green}{1} & 0 & \textcolor{blue}{0} & \cdots & \vdots \\ \vdots & \vdots & \vdots & \vdots & \vdots & \vdots &\ddots & \vdots & 0 \\ 0 & 0 & \cdots & 0 & \cdots & \cdots &\cdots & 0 & 1 \end{pmatrix} \nonumber. \end{eqnarray*}
Finally, $P_3 L_2 P_3^{-1}= P_3 L_2 P_3$ interchanges columns $i$ (green) and $k$ (blue) of $P_3 L_2$. The $1$ in column $k$ (blue) takes the place of the $0$ in column $i$ (green). In the same way, the green $1$ in row $k$ takes the position of the blue $0$ in the same row.
That is
\begin{eqnarray*} P_3 L_2 P_3^{-1} = \begin{pmatrix} 1 & 0 & \cdots & 0 & \cdots & \cdots &\cdots & \cdots & 0 \\ 0 & 1 & \cdots & 0 & \cdots & \cdots &\cdots & \cdots & 0 \\ \vdots & l_{32} & \ddots & \vdots & \vdots &\vdots & \vdots & \vdots & \vdots\\ \vdots & \vdots & \cdots & 1 & 0 & \cdots &\cdots& \cdots & \vdots \\ \vdots & \boxed{l_{k2}} & \vdots & \vdots & \textcolor{green}{1} & 0 & \textcolor{blue}{0} & 0 & \vdots \\ \vdots & \vdots & \cdots & \vdots & \ddots & \ddots & \ddots & \cdots & \vdots \\ \vdots & \boxed{l_{i2}} & \cdots & 0 & \textcolor{green}{0} & 0 & \textcolor{blue}{1} & \cdots & \vdots \\ \vdots & \vdots & \vdots & \vdots & \vdots & \vdots &\ddots & \vdots & 0 \\ 0 & 0 & \cdots & 0 & \cdots & \cdots &\cdots & 0 & 1 \end{pmatrix} \nonumber. \end{eqnarray*}
This is, again, an atomic matrix with the entries $l_{i2}$ and $l_{k2}$ interchanged. In the same way, $P_2 L_1 P_2^{-1}$ is atomic, and so is $\Lambda_1 = P_3 P_2 L_1 P_2^{-1} P_3^{-1}$.
In general, for
\begin{eqnarray*} U= L_{n-1} P_{n-1} L_{n-2} P_{n-2} \cdots P_2 L_1 P_1 A, \end{eqnarray*} we define
\begin{eqnarray*} \Lambda_i = P_{n-1} \cdots P_{i+2} P_{i+1} L_i P_{i+1}^{-1} P_{i+2}^{-1} \cdots P_{n-1}^{-1}, \end{eqnarray*} where an empty product of permutations (the case $i = n-1$) is taken to be the identity. The same procedure shown above guarantees that each $\Lambda_i$ is atomic, and so $\Lambda=\prod_i \Lambda_i$ is lower triangular with ones on the diagonal.
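For general $n$, this conjugation can be tested directly; the sketch below builds each $\Lambda_i$ from random stand-in factors (0-indexed rows, illustrative only) and checks that it stays atomic:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 6

def swap(i, k):
    """Permutation matrix exchanging rows i and k (its own inverse)."""
    P = np.eye(n)
    P[[i, k]] = P[[k, i]]
    return P

def atomic(i):
    """Unit lower triangular, nonzeros only below the diagonal in column i."""
    L = np.eye(n)
    L[i+1:, i] = rng.standard_normal(n - 1 - i)
    return L

# stand-ins: L_i atomic in column i, P_i swaps row i with a row >= i.
Ls = [atomic(i) for i in range(n - 1)]
Ps = [swap(i, int(rng.integers(i, n))) for i in range(n - 1)]

# Lambda_i = P_{n-1} ... P_{i+1} L_i P_{i+1}^{-1} ... P_{n-1}^{-1}
Lams = []
for i in range(n - 1):
    Lam = Ls[i]
    for j in range(i + 1, n - 1):
        Lam = Ps[j] @ Lam @ Ps[j].T
    Lams.append(Lam)

# every Lambda_i is still atomic in column i:
for i, Lam in enumerate(Lams):
    assert np.allclose(np.diag(Lam), 1)        # unit diagonal
    assert np.allclose(np.triu(Lam, 1), 0)     # lower triangular
    assert np.allclose(np.delete(Lam - np.eye(n), i, axis=1), 0)
```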
Hence this completes the verification that $PA=LU$.