This algorithm will find a basis for the span of some vectors. How/why does it work?
Say I want to find a basis for $\operatorname{span}\{(1,2,5),(2,4,10),(-3,-5,-13),(2,1,4),(-4,-6,-16)\}$. Google tells me that to get the answer, I'm supposed to write down the vectors as columns of a matrix:
$$ \begin{pmatrix} 1 & 2 & -3 & 2 & -4 \\ 2 & 4 & -5 & 1 & -6 \\ 5 & 10 & -13 & 4 & -16 \\ \end{pmatrix} $$
... then bring said matrix to reduced row echelon form...
$$ \begin{pmatrix} 1 & 2 & 0 & -7 & 2 \\ 0 & 0 & 1 & -3 & 2 \\ 0 & 0 & 0 & 0 & 0 \\ \end{pmatrix} $$
... then look at the pivots; in this case the first and third columns contain pivots, therefore the first and third columns of the original matrix are linearly independent while the remaining columns are linear combinations of them, and therefore those two vectors are a basis of the span.
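In case it helps to see the recipe spelled out, here is a minimal sketch of it in Python (the function name `pivot_columns` is my own; exact fractions are used to avoid floating-point issues):

```python
from fractions import Fraction

def pivot_columns(rows):
    """Bring a matrix (given as a list of rows) to reduced row
    echelon form and return the indices of its pivot columns."""
    m = [[Fraction(x) for x in row] for row in rows]
    pivots = []
    r = 0
    for c in range(len(m[0])):
        # Find a row at or below r with a nonzero entry in column c.
        pr = next((i for i in range(r, len(m)) if m[i][c] != 0), None)
        if pr is None:
            continue
        m[r], m[pr] = m[pr], m[r]
        piv = m[r][c]
        m[r] = [x / piv for x in m[r]]            # scale the pivot to 1
        for i in range(len(m)):
            if i != r and m[i][c] != 0:           # clear column c elsewhere
                m[i] = [a - m[i][c] * b for a, b in zip(m[i], m[r])]
        pivots.append(c)
        r += 1
        if r == len(m):
            break
    return pivots

# The matrix from the question: the five vectors as columns.
A = [[1, 2, -3, 2, -4],
     [2, 4, -5, 1, -6],
     [5, 10, -13, 4, -16]]

print(pivot_columns(A))  # → [0, 2]: the 1st and 3rd columns form a basis
```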
Okay, that's simple enough to remember and use in the exam but it still feels like magic to me. How does this work? Why do the pivots show which vectors are linearly independent?
The key observation is in Arthur’s comment: elementary row operations preserve linear dependency relations among the columns of a matrix. This is because performing an elementary row operation is equivalent to left-multiplying by an invertible elementary matrix $E$, and for invertible $E$ we have $Ax = 0$ if and only if $(EA)x = 0$ — so the columns of $A$ and of $EA$ satisfy exactly the same dependencies.
Combine this with the fact that the row rank and column rank of a matrix (that is, the dimensions of its row and column spaces) are equal, and one can see why the method works. In reduced row echelon form, each pivot column has a $1$ in a different row from every other pivot column and zeros elsewhere, so the set of pivot columns is clearly linearly independent. Since elementary row operations don’t affect linear dependencies among columns, the corresponding columns of the original matrix are linearly independent as well. Finally, the number of pivots equals the rank of the matrix, which is the dimension of the column space, so this set of vectors forms a basis for the column space, i.e., a basis for the span of your original set of vectors.
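To make the "row operations preserve column dependencies" point concrete, here is a small check on the matrix from the question (the names `matvec` and `E` are my own): the dependency $2c_1 - c_2 = 0$ among the columns survives the row operation $R_2 \to R_2 - 2R_1$, exactly because $Ax = 0 \iff (EA)x = 0$ when $E$ is invertible.

```python
def matvec(M, x):
    """Multiply a matrix M (list of rows) by a vector x."""
    return [sum(a * b for a, b in zip(row, x)) for row in M]

A = [[1, 2, -3, 2, -4],
     [2, 4, -5, 1, -6],
     [5, 10, -13, 4, -16]]

# The second column is twice the first, so x = (2, -1, 0, 0, 0)
# encodes a dependency among the columns: A @ x = 0.
x = [2, -1, 0, 0, 0]
print(matvec(A, x))  # → [0, 0, 0]

# The elementary row operation R2 -> R2 - 2*R1, written as
# left-multiplication by the (invertible) elementary matrix E.
E = [[1, 0, 0],
     [-2, 1, 0],
     [0, 0, 1]]
EA = [[sum(E[i][k] * A[k][j] for k in range(3)) for j in range(5)]
      for i in range(3)]

# The very same x is still a dependency of the columns of E*A.
print(matvec(EA, x))  # → [0, 0, 0]
```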