What operations can I do to simplify calculations of determinant?

The determinant of an $n\times n$ matrix $A$ is a multilinear, antisymmetric function on the columns of $A$.

Let $$A = \begin{bmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} & \cdots & a_{2n} \\ \vdots & \vdots & & \vdots \\ a_{n1} & a_{n2} & \cdots & a_{nn}\end{bmatrix}$$

Now let $\mathbf a_i$ be the $i^{th}$ column of $A$. Thus $\mathbf a_2 = \begin{bmatrix} a_{12} \\ a_{22} \\ \vdots \\ a_{n2}\end{bmatrix}$, for instance.

Then $\det(A) = \det(\mathbf a_1, \mathbf a_2, \cdots, \mathbf a_n)$.


Multilinearity

OK. Now what does multilinear mean? It means that $\det$ is linear in each of its arguments. Thus if you have something like $\det(3\mathbf a_1 + 4\mathbf b_1, \mathbf a_2, \cdots, \mathbf a_n)$, then this is exactly the same as $3\det(\mathbf a_1, \mathbf a_2, \cdots, \mathbf a_n) + 4\det(\mathbf b_1, \mathbf a_2, \cdots, \mathbf a_n)$. Also:

$$\begin{align}\det(\mathbf a_1, -\mathbf a_2 + 5\mathbf b_2, \mathbf a_3, 6\mathbf a_4 + 2\mathbf b_4) &= -\det(\mathbf a_1, \mathbf a_2, \mathbf a_3, 6\mathbf a_4 + 2\mathbf b_4) + 5\det(\mathbf a_1, \mathbf b_2, \mathbf a_3, 6\mathbf a_4 + 2\mathbf b_4) \\ &= -[6\det(\mathbf a_1, \mathbf a_2, \mathbf a_3, \mathbf a_4) + 2\det(\mathbf a_1, \mathbf a_2, \mathbf a_3, \mathbf b_4)] \\ &\ \ \ \ + 5[6\det(\mathbf a_1, \mathbf b_2, \mathbf a_3, \mathbf a_4) + 2\det(\mathbf a_1, \mathbf b_2, \mathbf a_3, \mathbf b_4)] \\ &= -6\det(\mathbf a_1, \mathbf a_2, \mathbf a_3, \mathbf a_4) - 2\det(\mathbf a_1, \mathbf a_2, \mathbf a_3, \mathbf b_4) \\ &\ \ \ \ + 30\det(\mathbf a_1, \mathbf b_2, \mathbf a_3, \mathbf a_4) + 10\det(\mathbf a_1, \mathbf b_2, \mathbf a_3, \mathbf b_4)\end{align}$$
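If you'd like to see multilinearity confirmed numerically, here's a small sketch in Python. The `det` helper is a hand-rolled Laplace expansion written just for this check (not a library routine), and the column values are arbitrary samples I made up:

```python
def det(M):
    """Determinant of a square matrix (list of rows) by Laplace expansion."""
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] * det([row[:j] + row[j + 1:] for row in M[1:]])
               for j in range(len(M)))


def from_cols(*cols):
    """Assemble a matrix (list of rows) from its columns."""
    return [list(row) for row in zip(*cols)]


a1, b1 = [2, 1, 1], [1, 0, 2]
a2, a3 = [2, 3, 4], [3, 5, 7]

# Linearity in the first argument: det(3*a1 + 4*b1, a2, a3)
combo = [3 * x + 4 * y for x, y in zip(a1, b1)]
lhs = det(from_cols(combo, a2, a3))

# ... equals 3*det(a1, a2, a3) + 4*det(b1, a2, a3)
rhs = 3 * det(from_cols(a1, a2, a3)) + 4 * det(from_cols(b1, a2, a3))
print(lhs, rhs)  # 15 15
```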


Antisymmetry

The fact that $\det$ is antisymmetric means that if you swap any two arguments of $\det$, you'll negate the sign. Thus $$\det(\mathbf a_1, \mathbf a_2) = -\det(\mathbf a_2, \mathbf a_1)$$

Likewise $$\begin{align}\det(\mathbf a_1, \mathbf a_2, \mathbf a_3, \mathbf a_4) &= -\det(\mathbf a_1, \mathbf a_4, \mathbf a_3, \mathbf a_2) \\ &= \det(\mathbf a_4, \mathbf a_1, \mathbf a_3, \mathbf a_2)\end{align}$$

So performing an even number of swaps leaves the sign unchanged, while performing an odd number of swaps negates it.

One consequence of this antisymmetry is that if two columns are scalar multiples of each other, then the determinant must be zero. After pulling the constant multiple out front, the two columns are identical. Swapping those two identical columns leaves the determinant looking exactly the same -- but it picks up a minus sign. So the determinant equals its own negative, and the only number equal to its own negative is zero. Thus the determinant is zero.

Thus $\det(\mathbf a_1, k\mathbf a_1) = 0$.
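Here's a quick numerical check of both antisymmetry facts, using sample columns and a hand-rolled Laplace-expansion determinant (not a library call):

```python
def det(M):
    """Determinant of a square matrix (list of rows) by Laplace expansion."""
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] * det([row[:j] + row[j + 1:] for row in M[1:]])
               for j in range(len(M)))


def from_cols(*cols):
    """Assemble a matrix (list of rows) from its columns."""
    return [list(row) for row in zip(*cols)]


a1, a2, a3 = [2, 1, 1], [2, 3, 4], [3, 5, 7]

d = det(from_cols(a1, a2, a3))
one_swap = det(from_cols(a2, a1, a3))   # swap columns 1 and 2
two_swaps = det(from_cols(a3, a1, a2))  # two swaps: sign restored
repeated = det(from_cols(a1, [5 * x for x in a1], a3))  # second column is 5*a1

print(one_swap == -d, two_swaps == d, repeated)  # True True 0
```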


So knowing these properties of the determinant, what operations can we do on the columns of a matrix? We can do $3$ things:

  1. We can switch any two columns. This will have the effect of negating the determinant.
  2. We can pull any constant times a column out of the determinant: i.e. $\det(\mathbf a_1, 2\mathbf a_2) = 2\det(\mathbf a_1, \mathbf a_2)$.
  3. We can add a scalar multiple of one of the columns to another column. This will not change the value of the determinant at all. That is $\det(\mathbf a_1, \mathbf a_2) = \det(\mathbf a_1, \mathbf a_2 + k\mathbf a_1)$.

Now you should try to prove each of the above by using the properties of the determinant that I gave above (there are only two of them, so it shouldn't be too hard).
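The three operations can also be checked numerically. This sketch uses a hand-rolled Laplace-expansion determinant (not a library routine) on sample columns, with an arbitrary multiplier $k = 7$ for the third operation:

```python
def det(M):
    """Determinant of a square matrix (list of rows) by Laplace expansion."""
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] * det([row[:j] + row[j + 1:] for row in M[1:]])
               for j in range(len(M)))


def from_cols(*cols):
    """Assemble a matrix (list of rows) from its columns."""
    return [list(row) for row in zip(*cols)]


a1, a2, a3 = [2, 1, 1], [2, 3, 4], [3, 5, 7]
d = det(from_cols(a1, a2, a3))

# 1. Swapping two columns negates the determinant.
assert det(from_cols(a2, a1, a3)) == -d

# 2. Scaling one column scales the determinant by the same factor.
assert det(from_cols(a1, [2 * x for x in a2], a3)) == 2 * d

# 3. Adding a scalar multiple of one column to another leaves it unchanged.
k = 7
assert det(from_cols(a1, [x + k * y for x, y in zip(a2, a1)], a3)) == d

print(d)  # 1
```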


One last thing before moving on to an example: the determinant of the transpose of a matrix is equal to the determinant of the matrix. That is, $\det(A^T) = \det(A)$. This implies that everything we did with columns above could equally well be done with the rows of a matrix. The most direct proof uses the Leibniz (permutation) expansion of the determinant, pairing each permutation with its inverse.
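As a quick numerical illustration of $\det(A^T) = \det(A)$ on a sample $3\times 3$ matrix (the `det` helper is a minimal hand-rolled Laplace expansion, not a library routine):

```python
def det(M):
    """Determinant of a square matrix (list of rows) by Laplace expansion."""
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] * det([row[:j] + row[j + 1:] for row in M[1:]])
               for j in range(len(M)))


A = [[2, 2, 3],
     [1, 3, 5],
     [1, 4, 7]]
At = [list(row) for row in zip(*A)]  # transpose

dA, dAt = det(A), det(At)
print(dA, dAt)  # 1 1
```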


Now we'll look at an example.

Let $$A = \begin{bmatrix} 2 & 2 & 3 \\ 1 & 3 & 5 \\ 1 & 4 & 7\end{bmatrix}$$

The way we'll compute $\det(A)$ is basically the same way we'd do a Gaussian elimination problem -- except we'll be doing column operations instead of row operations. We could do row operations, but I flipped a coin and it landed column-side up. :)

$$\begin{align}\det(A) &= \det\left(\begin{bmatrix} 2 \\ 1 \\ 1\end{bmatrix}, \begin{bmatrix} 2 \\ 3 \\ 4\end{bmatrix}, \begin{bmatrix} 3 \\ 5 \\ 7\end{bmatrix} \right) \\ &= \det\left(\begin{bmatrix} 2 \\ 1 \\ 1\end{bmatrix}, \begin{bmatrix} 2 \\ 3 \\ 4\end{bmatrix} - \begin{bmatrix} 2 \\ 1 \\ 1\end{bmatrix}, \begin{bmatrix} 3 \\ 5 \\ 7\end{bmatrix} - \frac 32\begin{bmatrix} 2 \\ 1 \\ 1\end{bmatrix}\right) \\ &= \det\left(\begin{bmatrix} 2 \\ 1 \\ 1\end{bmatrix}, \begin{bmatrix} 0 \\ 2 \\ 3\end{bmatrix}, \begin{bmatrix} 0 \\ \frac 72 \\ \frac {11}2\end{bmatrix}\right) \\ &= \frac 12\det\left(\begin{bmatrix} 2 \\ 1 \\ 1\end{bmatrix}, \begin{bmatrix} 0 \\ 2 \\ 3\end{bmatrix}, \begin{bmatrix} 0 \\ 7 \\ 11\end{bmatrix}\right) \\ &= \frac 12\det\left(\begin{bmatrix} 2 \\ 1 \\ 1\end{bmatrix}, \begin{bmatrix} 0 \\ 2 \\ 3\end{bmatrix}, \begin{bmatrix} 0 \\ 7 \\ 11\end{bmatrix} - 3\begin{bmatrix} 0 \\ 2 \\ 3\end{bmatrix}\right) \\ &= \frac 12\det\left(\begin{bmatrix} 2 \\ 1 \\ 1\end{bmatrix}, \begin{bmatrix} 0 \\ 2 \\ 3\end{bmatrix}, \begin{bmatrix} 0 \\ 1 \\ 2\end{bmatrix}\right) \\ &= -\frac 12\det\left(\begin{bmatrix} 2 \\ 1 \\ 1\end{bmatrix}, \begin{bmatrix} 0 \\ 1 \\ 2\end{bmatrix}, \begin{bmatrix} 0 \\ 2 \\ 3\end{bmatrix}\right) \\ &= -\frac {1}{2}\det\left(\begin{bmatrix} 2 \\ 1 \\ 1\end{bmatrix}, \begin{bmatrix} 0 \\ 1 \\ 2\end{bmatrix}, \begin{bmatrix} 0 \\ 2 \\ 3\end{bmatrix} - 2\begin{bmatrix} 0 \\ 1 \\ 2\end{bmatrix}\right) \\ &= -\frac 12\det\left(\begin{bmatrix} 2 \\ 1 \\ 1\end{bmatrix}, \begin{bmatrix} 0 \\ 1 \\ 2\end{bmatrix}, \begin{bmatrix} 0 \\ 0 \\ -1\end{bmatrix}\right)\end{align}$$

At this point we just have to remember that the determinant of a triangular matrix is the product of the diagonal elements. Thus $\det(A) = -\frac12(2)(1)(-1) = 1$.

Notice also that I did a couple of steps that I didn't really need to because I wanted to demonstrate all three operations to you.
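As a final sanity check, the reduction above can be replayed in code: apply each column operation in turn, track the running scalar factor, and multiply the diagonal of the resulting triangular matrix. (The `det` name is avoided here on purpose; everything below is elementary list arithmetic.)

```python
def from_cols(*cols):
    """Assemble a matrix (list of rows) from its columns."""
    return [list(row) for row in zip(*cols)]


c1, c2, c3 = [2, 1, 1], [2, 3, 4], [3, 5, 7]
factor = 1

c2 = [x - y for x, y in zip(c2, c1)]        # c2 -> c2 - c1
c3 = [x - 1.5 * y for x, y in zip(c3, c1)]  # c3 -> c3 - (3/2) c1
c3 = [2 * x for x in c3]                    # pull the factor 1/2 out of c3 ...
factor *= 0.5                               # ... in front of the determinant
c3 = [x - 3 * y for x, y in zip(c3, c2)]    # c3 -> c3 - 3 c2
c2, c3 = c3, c2                             # swap columns 2 and 3 ...
factor *= -1                                # ... which negates the determinant
c3 = [x - 2 * y for x, y in zip(c3, c2)]    # c3 -> c3 - 2 c2

# The matrix is now lower triangular, so the determinant is the
# running factor times the product of the diagonal entries.
M = from_cols(c1, c2, c3)
answer = factor * M[0][0] * M[1][1] * M[2][2]
print(answer)  # 1.0
```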