Are a square matrix's columns and rows either both (separately) linearly independent or both (separately) linearly dependent?

Here's an argument more-or-less from first principles.

If the rows of the square matrix $A$ are linearly independent, then row-reducing $A$ yields the identity matrix, so the only solution of $Av=0$ is $v=0$.

If the columns of $A$ are linearly dependent, say, $$a_1c_1+a_2c_2+\cdots+a_nc_n=0$$ where the $c_i$ are the columns and the $a_i$ are not all zero, then $Av=0$ where $$v=(a_1,a_2,\dots,a_n)\ne0$$

So, if the columns are dependent, then $Av=0$ has a nonzero solution, and by the contrapositive of the first observation the rows must be dependent as well.

Now apply the same argument to the transpose of $A$ to conclude that if the rows of $A$ are dependent then so are the columns.
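As a quick numerical sanity check of the whole equivalence (a sketch, not part of the proof; the matrix below is a made-up example whose third column is the sum of the first two):

```python
import numpy as np

# A 3x3 matrix whose third column is the sum of the first two,
# so the columns satisfy c1 + c2 - c3 = 0.
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 9.0],
              [7.0, 8.0, 15.0]])

# The coefficient vector from the dependence relation.
v = np.array([1.0, 1.0, -1.0])

print(A @ v)                       # [0. 0. 0.] -- Av = 0 with v != 0
print(np.linalg.matrix_rank(A))    # 2 < 3, so the rows are dependent too
print(np.linalg.matrix_rank(A.T))  # also 2, matching the transpose argument
```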


A more intuitive argument (why, for a square $n\times n$ matrix, linear dependence of the rows implies linear dependence of the columns):

Given that the rows are linearly dependent, performing Gaussian elimination must produce a row of all zeroes (possibly more than one).

Therefore the columns of the row-reduced echelon form are linearly dependent. That's so because they all have a zero in the same entry, so all $n$ of them lie in an $(n-1)$-dimensional subspace of $\mathbb{R}^n$, and $n$ vectors in an $(n-1)$-dimensional space cannot be linearly independent.
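For concreteness, here's a small sketch of the zero-row claim using SymPy's `rref` (the matrix is a made-up example whose third row is the sum of the first two):

```python
from sympy import Matrix

# A 3x3 matrix whose third row is the sum of the first two.
A = Matrix([[1, 2, 3],
            [4, 5, 6],
            [5, 7, 9]])

R, pivots = A.rref()  # reduced row echelon form and its pivot columns
print(R)       # Matrix([[1, 0, -1], [0, 1, 2], [0, 0, 0]]) -- a zero row appears
print(pivots)  # (0, 1): only two pivots, so the three columns are dependent
```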

And since the columns of the reduced matrix are linearly dependent, so are the columns of the original matrix, which is our claim.

That's so because Gaussian elimination preserves linear relations between columns! More specifically, if some column of the original matrix is a linear combination of other columns, then the corresponding columns of the row-reduced echelon form satisfy exactly the same relation: the row operations amount to left multiplication by an invertible matrix $E$, and $Av=0$ if and only if $EAv=0$. (Otherwise it wouldn't be possible, for example, to find the null space of the original matrix by examining the much simpler columns of its row-reduced echelon form.)
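Here's a sketch of that preservation, reusing the made-up SymPy example from above: the relation among the columns of the echelon form is exactly the relation among the columns of the original matrix.

```python
from sympy import Matrix

A = Matrix([[1, 2, 3],
            [4, 5, 6],
            [5, 7, 9]])
R, _ = A.rref()

# In R, the third column equals -1*(first column) + 2*(second column),
# i.e. col1 - 2*col2 + col3 = 0. The very same relation holds in A:
v = Matrix([1, -2, 1])
print(A * v)  # Matrix([[0], [0], [0]])
print(R * v)  # Matrix([[0], [0], [0]]) -- same null space, same column relations
```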