Is the rank of a matrix the same as that of its transpose? If so, how can I prove it?
Solution 1:
The answer is yes. This statement often goes under the name "row rank equals column rank". Knowing that, it is easy to search the internet for proofs.
Also any reputable linear algebra text should prove this: it is indeed a rather important result.
Finally, since you said that you had only a substitute lecturer, I won't castigate him, but this would be a distressing lacuna of knowledge for someone who is a regular linear algebra lecturer.
Solution 2:
There are several simple proofs of this result. Unfortunately, most textbooks use a rather complicated approach based on reduced row-echelon forms. Please see some elegant proofs on the Wikipedia page (contributed by myself):
http://en.wikipedia.org/wiki/Rank_%28linear_algebra%29
or the page on rank factorization:
http://en.wikipedia.org/wiki/Rank_factorization
Another of my favorites is the following:
Define $\operatorname{rank}(A)$ to mean the column rank of $A$: $\operatorname{col rank}(A) = \dim \{Ax: x \in \mathbb{R}^n\}$. Let $A^{t}$ denote the transpose of $A$. First show that $A^{t}Ax = 0$ if and only if $Ax = 0$. This is standard linear algebra: one direction is trivial, the other follows from:
$$A^{t}Ax=0 \implies x^{t}A^{t}Ax=0 \implies (Ax)^{t}(Ax) = 0 \implies Ax = 0$$
Therefore, the columns of $A^{t}A$ satisfy the same linear relationships as the columns of $A$. It doesn't matter that they have different numbers of rows: they have the same number of columns, and since those columns satisfy the same linear relationships, they have the same column rank. (This also follows from the rank-nullity theorem, provided you have proved that theorem independently, i.e. without assuming row rank = column rank.)
Therefore, $\operatorname{col rank}(A) = \operatorname{col rank}(A^{t}A) \leq \operatorname{col rank}(A^{t})$. (This last inequality follows because each column of $A^{t}A$ is a linear combination of the columns of $A^{t}$. So, $\operatorname{col sp}(A^{t}A)$ is a subset of $\operatorname{col sp}(A^{t})$.) Now simply apply the argument to $A^{t}$ to get the reverse inequality, proving $\operatorname{col rank}(A) = \operatorname{col rank}(A^{t})$. Since $\operatorname{col rank}(A^{t})$ is the row rank of A, we are done.
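The chain of equalities above is easy to check numerically. Here is a quick NumPy sanity check (the particular matrix and the use of `numpy.linalg.matrix_rank` are illustrative choices, not part of the proof):

```python
import numpy as np

rng = np.random.default_rng(0)
# Build a rank-deficient 5x4 real matrix: the fourth column is a
# linear combination of the first two, so the column rank is 3.
B = rng.standard_normal((5, 3))
A = np.hstack([B, B[:, :1] + 2.0 * B[:, 1:2]])

rank = np.linalg.matrix_rank
rank_A, rank_AtA, rank_At = rank(A), rank(A.T @ A), rank(A.T)
```

Here `rank_A`, `rank_AtA` and `rank_At` all come out equal to 3, matching $\operatorname{col rank}(A) = \operatorname{col rank}(A^{t}A) = \operatorname{col rank}(A^{t})$.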
Solution 3:
Since you talked about reduced row-echelon form, I assume you know what elementary row and column operations are. The basic fact concerning these operations is the following:
Elementary (row or column) operations change neither the row rank nor the column rank of a matrix.
Now, given a nonzero matrix $A$, try the following:
- Bring $A$ to its reduced row-echelon form $R$ using elementary row operations.
- Bring $R$ to its reduced column-echelon form $B$ using elementary column operations.
Then $B$ is of the form $$ \begin{pmatrix} 1&&&0&\ldots&0\\ &\ddots&&\vdots&&\vdots\\ &&1&0&\ldots&0\\ 0&\ldots&0&0&\ldots&0\\ \vdots&&\vdots&\vdots&&\vdots\\ 0&\ldots&0&0&\ldots&0\\ \end{pmatrix}. $$ Now it is obvious that the row rank of $B$ is equal to the column rank of $B$ (which is equal to the number of ones in the above "reduced row-and-column-echelon form"). Hence the row rank of $A$ is equal to the column rank of $A$, i.e. the row rank of $A$ is equal to the row rank of $A^T$.
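As a concrete illustration of the two reduction steps (a minimal sketch: `rref` below is a small Gauss-Jordan helper written for this example, working in floating point), reducing rows and then columns produces exactly the block form above:

```python
import numpy as np

def rref(M, tol=1e-12):
    """Reduced row-echelon form via Gauss-Jordan elimination (floats)."""
    M = M.astype(float).copy()
    rows, cols = M.shape
    r = 0
    for c in range(cols):
        if r == rows:
            break
        pivot = r + np.argmax(np.abs(M[r:, c]))  # partial pivoting
        if abs(M[pivot, c]) < tol:
            continue                             # no pivot in this column
        M[[r, pivot]] = M[[pivot, r]]            # swap rows
        M[r] /= M[r, c]                          # scale pivot to 1
        for i in range(rows):
            if i != r:
                M[i] -= M[i, c] * M[r]           # clear the rest of column c
        r += 1
    return M

A = np.array([[1., 2., 3.],
              [2., 4., 6.],
              [1., 0., 1.]])
R = rref(A)        # elementary row operations only
B = rref(R.T).T    # column operations on R = row operations on R^T
```

For this $A$ (rank 2), $B$ comes out as $\operatorname{diag}(1,1,0)$, and the row and column ranks of $A$, $R$ and $B$ all agree.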
Solution 4:
Yes, it is a fact. This is true over any commutative field. See, for instance, the first chapter of Emil Artin's Galois Theory for a very elementary argument.
If you want to phrase that argument in more conceptual terms, view the matrices as linear transformations. If $A$ is the matrix and $A^t$ its transpose, then $A^tA$ and $A$ have the same domain; use the fact that they have the same null space, together with the dimension theorem rank + nullity = dimension of the domain.
This argument works for real matrices only: for complex matrices, $A^tA$ and $A$ may not have the same null space (use the conjugate transpose $A^*$ instead of $A^t$).
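A minimal NumPy illustration of why the plain transpose fails over $\mathbb{C}$: for $A = (1, i)^t$ we get $A^tA = 1 + i^2 = 0$, so $A^tAx = 0$ for every $x$ even though $Ax \neq 0$, while the conjugate transpose repairs the argument:

```python
import numpy as np

A = np.array([[1.0], [1.0j]])   # the column vector (1, i)^t

AtA = A.T @ A                   # plain transpose:     [[1 + i^2]] = [[0]]
AhA = A.conj().T @ A            # conjugate transpose: [[1 + 1]]   = [[2]]
```

Over $\mathbb{C}$ the correct statement is $A^{*}Ax = 0 \iff Ax = 0$, since $x^{*}A^{*}Ax = \|Ax\|^2$ replaces $(Ax)^{t}(Ax)$.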
Solution 5:
(1) If $f:V\to W$ is a linear map and $f^*:W^*\to V^*$ is its transpose, then we have a canonical isomorphism
$$\text{Im}(f^*)\cong\text{Im}(f)^*.$$
This can be seen as follows:
(2) If $$ V\ \overset{p}{\twoheadrightarrow}A\ \overset{i}{\rightarrowtail}\ W\quad\text{and}\quad V\ \overset{q}{\twoheadrightarrow}B\ \overset{j}{\rightarrowtail}\ W $$ are two diagrams of linear maps such that
(a) $i$ and $j$ are injective, $p$ and $q$ are surjective,
(b) $i\circ p=j\circ q$,
then there is a unique linear map $\varphi:A\to B$ such that $\varphi\circ p=q$ and $j\circ\varphi=i$. Moreover $\varphi$ is bijective. The proof is easy: since $p$ is surjective, one must set $\varphi(p(v))=q(v)$, and this is well defined because $i\circ p=j\circ q$ and $j$ is injective.
To prove that (2) implies (1), note that the three diagrams
$$
V\ \overset{p}{\twoheadrightarrow}\ \text{Im}(f)\ \overset{i}{\rightarrowtail}\ W,
$$
$$
W^*\ \overset{i^*}{\twoheadrightarrow}\ \text{Im}(f)^*\ \overset{p^*}{\rightarrowtail}\ V^*,
$$
$$
W^*\ \overset{q}{\twoheadrightarrow}\ \text{Im}(f^*)\ \overset{j}{\rightarrowtail}\ V^*,
$$
where $p,i,q,j$ are the obvious maps, satisfy (a). As $p^*\circ i^*=f^*=j\circ q$, the last two diagrams also satisfy (b), so (2) implies (1).
Assume the rank of $f:V\to W$ is infinite. The Erdős-Kaplansky Theorem then implies
$$\text{rank}(f^*)=|K|^{\text{rank}(f)},$$
where $K$ is the ground field and $|X|$ is the cardinality of $X$ for any set $X$.
More precisely, the Erdős-Kaplansky Theorem says $$ \dim(V^*)=|K|^{\dim(V)} $$ whenever $V$ is infinite dimensional, or, equivalently $$ \dim(K^S)=|K^S|, $$ where $S$ is an infinite set and $K^S$ is the set of families $(a_s)_{s\in S}$ with $a_s$ in $K$. In words:
The dimension of an infinite dimensional dual vector space is equal to its cardinality.
For a proof of the Erdős-Kaplansky Theorem, please see this answer.