Simple proof that linearly dependent columns imply linearly dependent rows

Solution 1:

I assume you want $A$ to be an $n \times n$ square matrix, otherwise the $1 \times 2$ matrix $A = \begin{bmatrix}1 & 1\end{bmatrix}$ is a counterexample.

(Side observation: the span of the columns need not be equal to the span of the rows; what you mean is that they have the same dimension.)

Assume there is a non-zero $x$ such that $A x = 0$. Since $x \ne 0$, some entry of $x$ is non-zero; after permuting the columns of $A$ and rescaling $x$, we can assume WLOG that the first column of $A$ is a linear combination of the others, or more precisely $$ x = \begin{bmatrix} -1\\ z \end{bmatrix}, $$ where $z$ is a column vector of length $n-1$.

If $B$ is the $n \times (n-1)$ matrix obtained by removing the first column from $A$, then $$\tag{prod} A = B Z, $$ where $Z$ is the $(n-1) \times n$ block matrix $$ Z = \begin{bmatrix} z & | &I \end{bmatrix}, $$ with $I$ the $(n-1) \times (n-1)$ identity matrix. (Indeed, $Ax = 0$ gives $Bz$ as the first column of $A$, and $BI = B$ reproduces the remaining columns.)

Now (prod) shows that the $n$ rows of $A$ are linear combinations of the rows of $Z$. Since $Z$ has only $n - 1$ rows, the rows of $A$ are $n$ vectors lying in a space spanned by $n-1$ vectors, hence linearly dependent; that is, there is a $y \ne 0$ such that $y^{t} A = 0$.
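As a numeric sanity check (not a substitute for the proof), here is a small instance of the factorization $A = BZ$; the specific $3 \times 3$ matrix below is my own choice, not from the question.

```python
# Sketch: verify A = B Z and exhibit a left null vector y for one
# hypothetical 3x3 example where column 1 = 1*(column 2) + 2*(column 3),
# i.e. x = (-1, 1, 2) and z = (1, 2).

def matmul(P, Q):
    """Multiply two matrices given as lists of rows."""
    return [[sum(P[i][k] * Q[k][j] for k in range(len(Q)))
             for j in range(len(Q[0]))] for i in range(len(P))]

A = [[1, 1, 0],
     [2, 0, 1],
     [3, 1, 1]]

B = [[1, 0],        # A with its first column removed
     [0, 1],
     [1, 1]]

Z = [[1, 1, 0],     # Z = [ z | I ]: first column z, then the 2x2 identity
     [2, 0, 1]]

assert matmul(B, Z) == A          # the factorization (prod)

# The 3 rows of A lie in the span of the 2 rows of Z, so they are
# linearly dependent; here row1 + row2 - row3 = 0:
y = [[1, 1, -1]]
assert matmul(y, A) == [[0, 0, 0]]
```

Each row of $A$ is a row of $B$ times $Z$, so the coefficients expressing the rows of $A$ in terms of the rows of $Z$ are just the rows of $B$.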

I am aware of the fact that you might consider this argument not to be satisfactory, because it reproduces part of the proof that the row rank equals the column rank.