Why is the determinant zero iff the column vectors are linearly dependent?

Solution 1:

It's easy to understand if you understand linear transformations.

What are vectors?

This is how you would think of a vector, right? One arrow pointing into space:

$\begin{bmatrix} x\\y\\z \end{bmatrix}$

But you can also describe them as one arrow for each dimension:

$x\begin{bmatrix} 1\\0\\0\end{bmatrix} + y\begin{bmatrix} 0\\1\\0 \end{bmatrix}+ z\begin{bmatrix} 0\\0\\1\end{bmatrix}$

Here we used the unit vector of each dimension as a basis.

What's a linear transformation?

With that model of vectors it's easy to describe linear transformations. Let's look at a 3x3 matrix $A$. When you apply the transformation $A$ to a vector $v$, you get a new vector $v'$: $Av = v'$. Usually you learn to compute it by multiplying the components of each row with the components of the column vector and adding them up:

$\begin{bmatrix}a&b&c\\d&e&f\\g&h&i\end{bmatrix} \begin{bmatrix}x\\y\\z \end{bmatrix} = \begin{bmatrix}ax+by+cz\\dx+ey+fz\\gx+hy+iz\end{bmatrix}$

But if you look at that new vector on the right you see that it's the same as

$x\begin{bmatrix} a\\d\\g\end{bmatrix} + y\begin{bmatrix} b\\e\\h \end{bmatrix}+ z\begin{bmatrix} c\\f\\i\end{bmatrix}$

Does this look familiar? Yes: the matrix is just a basis of 3 column vectors, with which you can map every vector to a new vector, linearly, by multiplying its components with the column vectors in place of the original unit vectors. If the matrix looks like this:

$\begin{bmatrix}1&0&0\\0&1&0\\0&0&1\end{bmatrix}$

which is the identity matrix, then of course nothing will happen. You just get the same vectors back, because the new basis is the same as the original unit vector basis.
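To see the two views side by side, here is a small pure-Python sketch (the matrix and vector are my own made-up numbers, not from the answer): the usual row-by-row dot products and the column-combination view give the same result.

```python
# A hypothetical 3x3 matrix and vector, just for illustration.
A = [[2, 1, 0],
     [0, 3, 1],
     [1, 0, 4]]
v = [1, 2, 3]

# Row view: dot product of each row of A with v.
row_view = [sum(A[i][j] * v[j] for j in range(3)) for i in range(3)]

# Column view: v_j times the j-th column of A, summed over j.
col_view = [sum(v[j] * A[i][j] for j in range(3)) for i in range(3)]

print(row_view)  # the two views agree
print(col_view)
```

Both loops compute the same sums in a different order, which is exactly the point: $Av$ *is* the linear combination of the columns of $A$ with coefficients from $v$.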

What's the meaning of the determinant?

You probably know how to calculate the determinant, or you can easily look it up. But what does it describe? In 2-dimensional space, two 2-dimensional vectors describe a parallelogram; in 3-dimensional space, three 3-dimensional vectors describe a parallelepiped. These have an area $A$ or a volume $V$ respectively. The determinant of a transformation matrix describes how much that area or volume gets scaled when you apply the transformation to such a set of vectors.
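Here is a quick 2-dimensional sketch of that scaling claim, with a hypothetical matrix of my own choosing: transform the two sides of a parallelogram and compare the new area to the old one.

```python
def det2(A):
    # Determinant of a 2x2 matrix.
    return A[0][0] * A[1][1] - A[0][1] * A[1][0]

def apply(A, v):
    # Apply the 2x2 transformation A to the vector v.
    return [A[0][0] * v[0] + A[0][1] * v[1],
            A[1][0] * v[0] + A[1][1] * v[1]]

def parallelogram_area(u, w):
    # Unsigned area of the parallelogram spanned by u and w
    # (itself a 2x2 determinant in absolute value).
    return abs(u[0] * w[1] - u[1] * w[0])

A = [[3, 1],
     [0, 2]]                  # det = 6
u, w = [1, 0], [1, 1]

before = parallelogram_area(u, w)                      # area 1
after = parallelogram_area(apply(A, u), apply(A, w))   # area 6

print(after / before, abs(det2(A)))  # both equal 6
```

The ratio of the areas equals $|\det A|$ no matter which parallelogram you start from.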

What does a determinant of 0 mean?

If you read the last paragraph you can probably deduce it: a determinant of zero means the volume or area becomes 0. When does that happen? Only when at least one dimension is lost!

When is that dimension lost?

Originally we have the identity matrix $\begin{bmatrix} 1&0\\0&1 \end{bmatrix}$, which consists of 2 independent vectors spanning a plane on which all kinds of parallelograms are located. Many other pairs of vectors can do the same. What happens when the 2 column vectors are dependent? They point in the same direction and therefore no longer span a plane, only a line!

Mathematically: if you apply such a matrix as a transformation to a vector, you hopefully remember that you simply get the new vector by adding the products of the vector's components with each basis vector. But since each basis vector is just a scaled version of the other, their sum will also be a scaled version of that vector.

Do you understand now? If you have 2 vectors describing a parallelogram with an area $A$ and apply a transformation whose matrix has 2 dependent column vectors, you end up with 2 vectors on the same line, meaning the new "parallelogram" has an area of 0. You lost a dimension: the determinant is 0.
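The collapse can be checked directly in a few lines of pure Python (again with a made-up matrix, where the second column is twice the first):

```python
def det2(A):
    return A[0][0] * A[1][1] - A[0][1] * A[1][0]

def apply(A, v):
    return [A[0][0] * v[0] + A[0][1] * v[1],
            A[1][0] * v[0] + A[1][1] * v[1]]

# The second column is 2x the first: linearly dependent columns.
A = [[1, 2],
     [3, 6]]

# Transform the unit square's sides.
u2 = apply(A, [1, 0])   # [1, 3]
w2 = apply(A, [0, 1])   # [2, 6] -- a scaled copy of u2, same line!

area = abs(u2[0] * w2[1] - u2[1] * w2[0])
print(area, det2(A))    # both 0: the "parallelogram" is flat
```

Both images land on the line through $(1,3)$, so the transformed square has no area left, and the determinant agrees.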

Solution 2:

The reason is that a matrix whose column vectors are linearly dependent will have a zero row show up in its row echelon form, which means that a parameter in the system can take any value you like: the system has infinitely many solutions. Recall also that in row echelon form the diagonal elements are 1's, excluding any rows of zeros. Finally, the determinant of an upper triangular matrix is the product of its diagonal elements, and since row operations only change the determinant by nonzero factors, the original determinant must be zero as well. It would look something like $$ A = \begin{pmatrix} 1 & a & b \\ 0 & 1 & c\\ 0& 0 & 0 \end{pmatrix}, \qquad \det(A) = 1\times1\times0 = 0. $$
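Checking the triangular case numerically (with hypothetical values for $a$, $b$, $c$): a general cofactor expansion and the diagonal product agree.

```python
def det3(M):
    # Cofactor expansion along the first row of a 3x3 matrix.
    return (M[0][0] * (M[1][1] * M[2][2] - M[1][2] * M[2][1])
          - M[0][1] * (M[1][0] * M[2][2] - M[1][2] * M[2][0])
          + M[0][2] * (M[1][0] * M[2][1] - M[1][1] * M[2][0]))

a, b, c = 5, 7, 9              # arbitrary entries above the diagonal
A = [[1, a, b],
     [0, 1, c],
     [0, 0, 0]]                # the zero row from the dependent columns

print(det3(A))                       # 0
print(A[0][0] * A[1][1] * A[2][2])   # product of the diagonal: also 0
```

Whatever $a$, $b$, $c$ are, the zero on the diagonal kills the product, and the full expansion agrees.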

Solution 3:

Do you know that adding a multiple of one column to another column does not change the determinant? Do you see that if the columns are linearly dependent, then there is a way of adding multiples of columns to other columns so that one column becomes all zeros? Do you know what you can say about the determinant of a matrix that has an all-zero column?
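This argument can be sketched on a concrete 2x2 example (the numbers are mine, chosen so the second column is 3 times the first):

```python
def det2(A):
    return A[0][0] * A[1][1] - A[0][1] * A[1][0]

# Column 2 is 3x column 1: the columns are linearly dependent.
A = [[1, 3],
     [2, 6]]

# Add (-3) * column 1 to column 2 -- a column operation that
# never changes the determinant.
B = [[A[0][0], A[0][1] - 3 * A[0][0]],
     [A[1][0], A[1][1] - 3 * A[1][0]]]

print(B)                  # second column is now all zeros
print(det2(A), det2(B))   # equal, and both 0
```

Once a column is all zeros, every term of the determinant's expansion contains a factor from that column, so the determinant must be 0; since the column operation preserved it, the original determinant was 0 too.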