Linear independence of vectors that form a non-square matrix

We can easily find out whether $n$ vectors in $\mathbb{R}^n$ are linearly independent by computing the determinant of the square matrix they form. But if the matrix is not square, the determinant is not defined. How, then, do we know whether the vectors are linearly independent?


There are a few ways to test if $m$ vectors in $\mathbb{R}^n$, where $m \le n$, are linearly independent. Here are the ones I can think of off the top of my head (a short code sketch of each follows the list):

  1. Form an $n \times m$ matrix by placing the vectors as columns, and row-reduce. The vectors are linearly independent if and only if there is a pivot in each column of the resulting row-echelon matrix. (This method is easy to verify, as it follows essentially from the definition of linear independence.)

  2. Form an $m \times n$ matrix by placing the vectors as rows, and row-reduce. The vectors are linearly independent if and only if the resulting row echelon form has no zero rows. (Row operations don't change the row space, so a zero row means the original row space has dimension smaller than $m$.)

  3. Form both the matrix $A$ from point 1, and $B$ from point 2. Notice that $A = B^T$, so $BA = A^TA$ is an $m \times m$ matrix (the Gram matrix of the vectors). The vectors are linearly independent if and only if $\det(BA) \neq 0$. (The vectors are linearly independent if and only if $\operatorname{rank}(A) = m$, and $\operatorname{rank}(BA) = \operatorname{rank}(A^TA) = \operatorname{rank}(A)$, so $BA$ is invertible exactly when $\operatorname{rank}(A) = m$.)
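Here is a minimal sketch of method 1 in Python, using sympy (my choice of library here; any exact rref routine would do). sympy's `rref()` returns the reduced matrix together with the pivot column indices, which is exactly what the test needs:

```python
from sympy import Matrix

def independent_by_columns(vectors):
    """Method 1: place the vectors as columns and count pivots."""
    A = Matrix(vectors).T             # n x m matrix, vectors as columns
    _, pivot_cols = A.rref()          # rref() returns (reduced matrix, pivot columns)
    return len(pivot_cols) == A.cols  # independent iff every column has a pivot

print(independent_by_columns([[1, 0, 1], [0, 1, 1]]))  # True
print(independent_by_columns([[1, 2, 3], [2, 4, 6]]))  # False
```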
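The same check via method 2, placing the vectors as rows and looking for zero rows in the echelon form:

```python
from sympy import Matrix

def independent_by_rows(vectors):
    """Method 2: place the vectors as rows and look for zero rows."""
    B = Matrix(vectors)               # m x n matrix, vectors as rows
    R, _ = B.rref()
    # Independent iff no row of the echelon form vanishes entirely.
    return all(any(e != 0 for e in R.row(i)) for i in range(R.rows))
```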
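And method 3, via the determinant of the $m \times m$ Gram matrix $BA = A^TA$. With sympy's exact arithmetic the comparison to zero is clean; with floating-point data you would want a tolerance instead of an exact equality test:

```python
from sympy import Matrix

def independent_by_gram(vectors):
    """Method 3: det of the m x m Gram matrix A^T A is nonzero iff independent."""
    A = Matrix(vectors).T             # n x m matrix, vectors as columns
    return (A.T * A).det() != 0       # A.T * A is the m x m matrix BA
```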