Decompose invertible matrix $A$ as $A = LPU$. (Artin, Chapter 2, Exercise M.11)
Decompose the matrix $A$ as $A = LPU$, where $A \in GL_n(\mathbb{R})$, $L$ is lower triangular, $U$ is upper triangular with all diagonal entries equal to $1$, and $P$ is a permutation matrix.
It is fairly easy to decompose any such invertible $A$ as $PA = LU$ (1) (more or less Gaussian elimination with pivoting). But what is a constructive proof that yields the decomposition above? The usual way I prove (1) is by applying elementary matrices on the left of the pivoted matrix $PA$, and this would (would it?) change $PA$, not only $A$ (if it changed only $A$, we could commute $P$ to the other side).
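For concreteness, decomposition (1) is what standard routines compute; here is a quick sketch using SciPy's `lu` (assuming SciPy is available — its convention is $A = PLU$, equivalently $P^{T}A = LU$):

```python
import numpy as np
from scipy.linalg import lu

A = np.array([[0., 2.], [1., 4.]])
P, L, U = lu(A)  # SciPy's convention: A = P @ L @ U
# equivalently P.T @ A = L @ U, the pivoted form (1)
print(np.allclose(P.T @ A, L @ U))  # True
```

Note that this is the $PA = LU$ form, not the $A = LPU$ form the exercise asks for.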
This is an exercise in Artin, 2nd edition, 2nd chapter, M* exercises.
Thanks.
There's a variant of Gaussian elimination where you don't move the pivots until after you have cleared the entries below them. For example, in the $2 \times 2$ case, if you have $$A = \begin{pmatrix} 0 & 2 \\ 1 & 4 \end{pmatrix},$$ the first step in Gaussian elimination would be to swap the two rows. But in the variant, you only do 'L' moves first. In each column, the entry you choose to become the pivot is the same as in Gaussian elimination (the topmost non-zero entry that does not occur in the row of a previous pivot), but you don't move it. You can still clear all the entries below a pivot, since adding a multiple of a row to a row strictly below it is a lower-triangular move; after you finish the forward phase (minus row swapping), you can swap rows to get to row echelon form. This gives $PLA = U$ with $L$ lower triangular and $P$ a permutation matrix, so multiplying on the left by $P^{-1}$ and then $L^{-1}$ gives $A = L^{-1}P^{-1}U$, which is $LPU$ form (the inverse of a lower triangular matrix is again lower triangular, and the inverse of a permutation matrix is again a permutation matrix).
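The procedure above can be sketched in NumPy as follows (a minimal illustration, not from Artin: the function name `lpu`, the tolerance for "non-zero", and the final normalization of $U$'s diagonal are my own choices):

```python
import numpy as np

def lpu(A, tol=1e-12):
    """Factor an invertible A as L @ P @ U, with L lower triangular,
    P a permutation matrix, and U unit upper triangular, using only
    'L moves' (row i -= m * row r with i > r) plus one final permutation."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    M = A.copy()
    E = np.eye(n)        # accumulates the lower-triangular row operations
    pivot_row = {}       # column j -> row chosen as its pivot
    used = set()         # rows already serving as pivot rows
    for j in range(n):
        # pivot: topmost non-zero entry of column j in a not-yet-pivot row
        r = next(i for i in range(n) if i not in used and abs(M[i, j]) > tol)
        used.add(r)
        pivot_row[j] = r
        # clear every entry strictly below the pivot (i > r keeps it an L move)
        for i in range(r + 1, n):
            if abs(M[i, j]) > tol:
                m = M[i, j] / M[r, j]
                M[i, :] -= m * M[r, :]
                E[i, :] -= m * E[r, :]
    # Q sorts the pivot rows into echelon order: row pivot_row[j] -> row j
    Q = np.zeros((n, n))
    for j in range(n):
        Q[j, pivot_row[j]] = 1.0
    T = Q @ M                          # upper triangular, pivots on the diagonal
    D = np.diag(np.diag(T))
    U = np.linalg.solve(D, T)          # unit upper triangular
    P = Q.T                            # the permutation in A = L P U
    # E A = M and Q M = D U give A = E^{-1} (Q^{-1} D Q) Q^{-1} U,
    # and Q^{-1} D Q is diagonal, so L below is lower triangular.
    L = np.linalg.solve(E, P @ D @ Q)
    return L, P, U
```

Running `lpu` on the $2 \times 2$ example reproduces the elimination step and swap shown below, up to the diagonal normalization of $U$.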
To illustrate with $A$, $$ \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix} \begin{pmatrix} 1 & 0 \\ -2 & 1 \end{pmatrix} \begin{pmatrix} 0 & 2 \\ 1 & 4 \end{pmatrix} = \begin{pmatrix} 1 & 0 \\ 0 & 2 \end{pmatrix} $$ so $$ A = \begin{pmatrix} 1 & 0 \\ 2 & 1 \end{pmatrix} \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix} \begin{pmatrix} 1 & 0 \\ 0 & 2 \end{pmatrix}. $$
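The displayed factorization is easy to check numerically:

```python
import numpy as np

L = np.array([[1, 0], [2, 1]])
P = np.array([[0, 1], [1, 0]])
U = np.array([[1, 0], [0, 2]])

print(L @ P @ U)  # [[0 2]
                  #  [1 4]], i.e. the original A
```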
(Of course, in this example there's a simpler $PU$ decomposition, but the algorithm hinted at above works in general. Also, to get the diagonal entries of $U$ equal to $1$ as the exercise requires, write $U = DU'$ with $D$ diagonal and $U'$ unit upper triangular; then $A = LPDU' = L(PDP^{-1})PU'$, and $PDP^{-1}$ is again diagonal, so it can be absorbed into the lower-triangular factor.)