I was wondering if someone could help me clarify something regarding the effect of swapping two rows on the sign of the determinant. I know that if $A$ is an $n\times n$ matrix and $B$ is an $n\times n$ matrix obtained from $A$ by swapping two rows, then

$$\det(B)=-\det(A)$$

but I don't know how to prove this.

I have been looking for proofs online, and I have read in textbooks and in lecture notes that this result is very hard to prove, with most approaches relying on induction. So I was wondering: is there anything wrong with using $\det(AB)=\det(A)\det(B)$, writing $B=EA$ where $E$ is an elementary matrix swapping two rows, and concluding that $\det(B)=\det(E)\det(A)=-\det(A)$? (Showing that $\det(E)=-1$ in this case is not that hard.)
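As a sanity check (not a proof), the proposed route $B = EA$, $\det(B)=\det(E)\det(A)$ can be tried on a small example. The `det` helper below is an illustrative pure-Python cofactor expansion, not part of the question:

```python
# Sanity check (not a proof) of the plan det(B) = det(E) * det(A) = -det(A).

def det(m):
    """Determinant by cofactor expansion along the first row."""
    if len(m) == 1:
        return m[0][0]
    return sum((-1) ** j * m[0][j]
               * det([row[:j] + row[j + 1:] for row in m[1:]])
               for j in range(len(m)))

def matmul(x, y):
    n = len(x)
    return [[sum(x[i][k] * y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# E swaps rows 0 and 2 of a 3x3 matrix when multiplied on the left.
E = [[0, 0, 1],
     [0, 1, 0],
     [1, 0, 0]]
A = [[2, 1, 3],
     [0, 4, 1],
     [5, 2, 6]]
B = matmul(E, A)            # A with rows 0 and 2 swapped

print(det(E))               # -1
print(det(B) == -det(A))    # True
```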


Yes, your method would work, and it is probably the most elegant one.

We can assume without loss of generality that $E$ interchanges the first two rows. This means that we can write $E$ in block-diagonal form: $$ \begin{pmatrix} 0 & 1 & 0 & \cdots & 0 \\ 1 & 0 & 0 & \cdots & 0 \\ 0 & 0 & 1 & \cdots & 0 \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & 0 & \cdots & 1 \end{pmatrix} $$

Now, if you know how to calculate the determinant by the usual Laplace expansion, starting at the bottom row, you see that the only nonzero terms are...

Also, why can we assume it interchanges the first two rows without loss of generality? (Think of what happens if we change a basis...)
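The claim above is easy to check numerically: a minimal sketch (pure Python, with `det` an illustrative cofactor/Laplace expansion) confirming that the swap-the-first-two-rows matrix has determinant $-1$ for several sizes $n$:

```python
# Numerical check: the elementary matrix that swaps the first two rows
# has determinant -1, for every size n tried.

def det(m):
    """Determinant by cofactor (Laplace) expansion along the first row."""
    if len(m) == 1:
        return m[0][0]
    return sum((-1) ** j * m[0][j]
               * det([row[:j] + row[j + 1:] for row in m[1:]])
               for j in range(len(m)))

def swap_first_two(n):
    """Identity matrix of size n with its first two rows interchanged."""
    e = [[1 if r == c else 0 for c in range(n)] for r in range(n)]
    e[0], e[1] = e[1], e[0]
    return e

for n in range(2, 7):
    print(n, det(swap_first_two(n)))    # always -1
```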


The problem with your approach is that generally in order to prove that $\det(AB) = \det(A) \det(B)$, one uses the fact that swapping two rows of a matrix multiplies the determinant by $-1$ (see, for example, the second proof in http://www.proofwiki.org/wiki/Determinant_of_Matrix_Product).

So you are kind of stuck proving the desired result directly. There are several equivalent definitions of the determinant, and depending on which one you use, the proof looks a bit different. But I would agree with @Arthur that for at least some definitions, a direct proof is pretty straightforward as long as you're comfortable with mathematical induction.


If you have a matrix $\mathrm A \in \mathbb{R}^{n \times n}$ and you swap its $i$-th and $j$-th rows, you are left-multiplying $\mathrm A$ by the permutation matrix

$$\mathrm E := \mathrm I_n - \mathrm{e}_i \mathrm{e}_i^\top - \mathrm{e}_j \mathrm{e}_j^\top + \mathrm{e}_i \mathrm{e}_j^\top + \mathrm{e}_j \mathrm{e}_i^\top = \mathrm I_n - (\mathrm{e}_j - \mathrm{e}_i) (\mathrm{e}_j - \mathrm{e}_i)^\top$$

where $\mathrm{e}_k$ is the vector with a one in the $k$-th entry and zeros in the other $n-1$ entries. Thus, the determinant of the new matrix is

$$\det (\mathrm E \mathrm A) = \det(\mathrm E) \cdot \det (\mathrm A)$$

Using the Weinstein-Aronszajn determinant identity,

$$\begin{array}{rl} \det (\mathrm E) &= \det \left( \mathrm I_n - (\mathrm{e}_j - \mathrm{e}_i) (\mathrm{e}_j - \mathrm{e}_i)^\top \right)\\ &= \det \left( 1 - (\mathrm{e}_j - \mathrm{e}_i)^\top (\mathrm{e}_j - \mathrm{e}_i) \right)\\ &= 1 - \left( 1 + (-1)^2 \right) = \color{blue}{-1}\end{array}$$

Hence, swapping two rows of $\mathrm A$ does change the sign of the determinant.
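A numerical sketch of the rank-one construction above (pure Python; the `det` helper is an illustrative cofactor expansion): with $v = \mathrm{e}_j - \mathrm{e}_i$, the matrix $E = I - vv^\top$ is exactly the row-swap matrix, and the Weinstein-Aronszajn identity predicts $\det(E) = 1 - v^\top v = -1$:

```python
# With v = e_j - e_i, the matrix E = I - v v^T swaps rows i and j, and
# the Weinstein-Aronszajn identity predicts det(E) = 1 - v^T v = -1.

def det(m):
    """Determinant by cofactor expansion along the first row (illustrative)."""
    if len(m) == 1:
        return m[0][0]
    return sum((-1) ** j * m[0][j]
               * det([row[:j] + row[j + 1:] for row in m[1:]])
               for j in range(len(m)))

n, i, j = 4, 1, 3
v = [0] * n
v[i], v[j] = -1, 1                      # v = e_j - e_i

E = [[(1 if r == c else 0) - v[r] * v[c] for c in range(n)]
     for r in range(n)]

print(1 - sum(x * x for x in v))        # 1 - v^T v = -1
print(det(E))                           # -1
```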


It is a chicken-and-egg kind of problem if you think about it that way. All of the following facts are connected to each other:

1- Swapping any two rows of a matrix flips the sign of its determinant.

2- The determinant of the product of two matrices equals the product of their determinants.

3- The determinant is invariant under row-replacement operations (adding a multiple of one row to another).

4- Multiplying an entire row (or column) of a matrix by a constant scales the determinant by that constant.

If you assume a suitable subset of these, the rest follow.

Below I use row-replacement operations and row scaling to show that the proof is quite straightforward.

Swapping two rows inverts the sign of the determinant.

For any square matrix, the proof that swapping two rows (or columns) flips the sign of the determinant uses only the fact that the determinant is invariant under row-replacement operations (adding a multiple of one row to another).

Consider a matrix $\mathrm{A} \in \mathbb{F}^{n\times n}$ as shown below:

$$ \mathrm{A}= \begin{bmatrix} a_{11} & a_{12} & \cdots & a_{1n}\\ a_{21} & a_{22} & \cdots & a_{2n}\\ \vdots & \vdots & \ddots & \vdots\\ a_{n1} & a_{n2} & \cdots & a_{nn}\\ \end{bmatrix} $$

Then the determinant of $\mathrm{A}$ is given by

$$ \det{\mathrm{A}} = \begin{vmatrix} a_{11} & a_{12} & \cdots & a_{1n}\\ a_{21} & a_{22} & \cdots & a_{2n}\\ \vdots & \vdots & \ddots & \vdots\\ a_{n1} & a_{n2} & \cdots & a_{nn}\\ \end{vmatrix} $$

Consider the two rows that you want to swap, and add one of them to the other. Without loss of generality, assume here that the first row is added to the second; the argument works the same for any two rows (or columns).

$\text{Row } 2 := \text{Row } 2 + \text{Row } 1$

$$ \det{\mathrm{(A)}} = \begin{vmatrix} a_{11} & a_{12} & \cdots & a_{1n}\\ a_{21}+a_{11} & a_{22}+a_{12} & \cdots & a_{2n}+a_{1n}\\ \vdots & \vdots & \ddots & \vdots\\ a_{n1} & a_{n2} & \cdots & a_{nn}\\ \end{vmatrix} $$

Next, subtract the resulting Row 2 from Row 1.

$\text{Row } 1 := \text{Row } 1 - \text{Row } 2$

$$ \det{\mathrm{(A)}} = \begin{vmatrix} -a_{21} & -a_{22} & \cdots & -a_{2n}\\ a_{21}+a_{11} & a_{22}+a_{12} & \cdots & a_{2n}+a_{1n}\\ \vdots & \vdots & \ddots & \vdots\\ a_{n1} & a_{n2} & \cdots & a_{nn}\\ \end{vmatrix} $$

Add the resulting Row 1 to Row 2.

$\text{Row } 2 := \text{Row } 2 + \text{Row } 1$

$$ \det{\mathrm{(A)}} = \begin{vmatrix} -a_{21} & -a_{22} & \cdots & -a_{2n}\\ a_{11} & a_{12} & \cdots & a_{1n}\\ \vdots & \vdots & \ddots & \vdots\\ a_{n1} & a_{n2} & \cdots & a_{nn}\\ \end{vmatrix} $$

Factor $-1$ out of the first row; it comes out of the determinant.

$$ \det{\mathrm{A}} = (-1) \begin{vmatrix} a_{21} & a_{22} & \cdots & a_{2n}\\ a_{11} & a_{12} & \cdots & a_{1n}\\ \vdots & \vdots & \ddots & \vdots\\ a_{n1} & a_{n2} & \cdots & a_{nn}\\ \end{vmatrix} $$

Finally, we have

$$ \begin{vmatrix} a_{11} & a_{12} & \cdots & a_{1n}\\ a_{21} & a_{22} & \cdots & a_{2n}\\ \vdots & \vdots & \ddots & \vdots\\ a_{n1} & a_{n2} & \cdots & a_{nn}\\ \end{vmatrix} = (-1) \begin{vmatrix} a_{21} & a_{22} & \cdots & a_{2n}\\ a_{11} & a_{12} & \cdots & a_{1n}\\ \vdots & \vdots & \ddots & \vdots\\ a_{n1} & a_{n2} & \cdots & a_{nn}\\ \end{vmatrix} \quad \blacksquare $$

The same argument holds for swapping any two rows (or columns).
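The three row operations above can also be traced on a concrete matrix (a minimal pure-Python sketch; the `det` helper is an illustrative cofactor expansion). Each row-replacement step preserves the determinant, and the final matrix is the original with its first two rows swapped and the new first row negated:

```python
# Tracing the three row operations: each row-replacement step leaves the
# determinant unchanged; the end state is A with rows 1 and 2 swapped
# and the new first row negated.

def det(m):
    """Determinant by cofactor expansion along the first row (illustrative)."""
    if len(m) == 1:
        return m[0][0]
    return sum((-1) ** j * m[0][j]
               * det([row[:j] + row[j + 1:] for row in m[1:]])
               for j in range(len(m)))

A = [[2, 1, 3],
     [0, 4, 1],
     [5, 2, 6]]
M = [row[:] for row in A]

M[1] = [a + b for a, b in zip(M[1], M[0])]   # Row 2 := Row 2 + Row 1
M[0] = [a - b for a, b in zip(M[0], M[1])]   # Row 1 := Row 1 - Row 2
M[1] = [a + b for a, b in zip(M[1], M[0])]   # Row 2 := Row 2 + Row 1

print(det(M) == det(A))                      # True: replacements preserve det
print(M[0] == [-a for a in A[1]])            # True: first row is -(old Row 2)
print(M[1] == A[0])                          # True: second row is old Row 1
```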