The rank of a linear transformation/matrix

First: linear transformation vs. matrix.

Matrices.

A matrix is a rectangular array of entries; in the context of linear algebra, the entries are always elements of the ground field (in your case, probably either the real numbers or the complex numbers). An $n\times m$ matrix has $n$ rows and $m$ columns.

If $A$ is an $n\times m$ matrix, the rank of $A$ is the dimension of the row-space (the subspace of $\mathbb{R}^m$, viewed as made up of row vectors, spanned by the rows), which equals the dimension of the column-space (the subspace of $\mathbb{R}^n$, viewed as being made up of column vectors, spanned by the columns).

The rank also equals the number of nonzero rows in the row echelon (or reduced row echelon) form of $A$, which is the same as the number of rows with leading $1$s in the reduced row echelon form, which is the same as the number of columns with leading $1$s in the reduced row echelon form.
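
If you want to check these facts computationally, here is a minimal sketch using Python's sympy library (my choice of tool) on a made-up $3\times 4$ matrix; the particular matrix is just for illustration.

```python
import sympy as sp

# A made-up 3x4 matrix, purely for illustration
A = sp.Matrix([
    [1, 2, 0, 3],
    [2, 4, 1, 7],
    [1, 2, 1, 4],
])

R, pivots = A.rref()       # reduced row echelon form and the indices of the pivot columns
print(R)                   # two nonzero rows, each with a leading 1; the third row is zero
print(pivots)              # (0, 2): the first and third columns contain the leading 1s
print(A.rank())            # 2
print(len(A.rowspace()), len(A.columnspace()))   # 2 2: both spaces have dimension 2
```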

The nullity of the matrix is the dimension of the subspace of $\mathbb{R}^m$ (viewed as column vectors) of all vectors $\mathbf{x}$ such that $A\mathbf{x}=\mathbf{0}$. This is equal to the number of parameters/degrees of freedom in the general solution to $A\mathbf{x}=\mathbf{0}$, and is equal to the number of columns that do not have leading $1$s in the reduced row-echelon form of $A$.

In particular, since every column in the reduced row echelon form of $A$ either has a leading $1$ or does not have a leading one, we conclude that $$\mathrm{rank}(A) + \mathrm{nullity}(A) = \text{number of columns of }A = m.$$
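
Continuing with the same made-up matrix, a quick sympy check of the nullity and of this identity might look like the following sketch.

```python
import sympy as sp

A = sp.Matrix([
    [1, 2, 0, 3],
    [2, 4, 1, 7],
    [1, 2, 1, 4],
])

null_basis = A.nullspace()      # a basis for {x : Ax = 0}
print(len(null_basis))          # 2: two columns without leading 1s, so two free parameters
for v in null_basis:
    assert A * v == sp.zeros(3, 1)

# Rank-Nullity, matrix version: rank + nullity = number of columns
assert A.rank() + len(null_basis) == A.cols     # 2 + 2 == 4
```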

This is sometimes referred to as the Rank-Nullity Theorem in its matrix version.

Linear transformations.

Given two vector spaces $V$ and $W$ (over the same field), a linear transformation is a map $T\colon V\to W$ such that $T(\alpha\mathbf{v}+\mathbf{v}') = \alpha T(\mathbf{v})+T(\mathbf{v}')$ for all $\mathbf{v},\mathbf{v}'\in V$ and all scalars $\alpha$.

The image of $T$ is the subspace of $W$ given by $\{T(\mathbf{v})\mid \mathbf{v}\in V\}$. The kernel of $T$ is the subspace of $V$ given by $\{\mathbf{v}\in V\mid T(\mathbf{v}) = \mathbf{0}\}$. The rank of $T$ is the dimension of the image of $T$, $\mathrm{rank}(T) = \dim(\mathrm{Im}(T))$. The nullity of $T$ is the dimension of the kernel of $T$, $\mathrm{nullity}(T) = \dim(\mathrm{ker}(T))$. The Rank-Nullity Theorem in its version for linear transformations states that, when $V$ is finite dimensional, $$\mathrm{rank}(T) + \mathrm{nullity}(T) = \dim(V).$$
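
For instance, if $T\colon\mathbb{R}^3\to\mathbb{R}^2$ is given by $T(x,y,z) = (x-y,\ y-z)$, then the image is all of $\mathbb{R}^2$, so $\mathrm{rank}(T) = 2$; the kernel is the line spanned by $(1,1,1)$, so $\mathrm{nullity}(T) = 1$; and indeed $2 + 1 = 3 = \dim(\mathbb{R}^3)$.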

Connection between the two.

An $n\times m$ matrix $A$ can be used to define a linear transformation $L_A\colon\mathbb{R}^m\to\mathbb{R}^n$ given by $L_A(\mathbf{v}) = A\mathbf{v}$. If we do this, the kernel of $L_A$ equals the nullspace of $A$, and the image of $L_A$ equals the column-space of $A$. In particular, $\mathrm{rank}(A) = \mathrm{rank}(L_A)$, $\mathrm{nullity}(A)=\mathrm{nullity}(L_A)$.
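
To see why the image of $L_A$ is the column-space, note that $A\mathbf{v}$ is just the linear combination of the columns of $A$ whose coefficients are the entries of $\mathbf{v}$. Here is a small sympy sketch checking this on the same made-up matrix as before, with an arbitrarily chosen $\mathbf{v}$.

```python
import sympy as sp

A = sp.Matrix([
    [1, 2, 0, 3],
    [2, 4, 1, 7],
    [1, 2, 1, 4],
])
v = sp.Matrix([4, -1, 2, 5])    # an arbitrary vector of R^4 (here m = 4, n = 3)

# L_A(v) = Av equals v_1*(column 1) + ... + v_m*(column m),
# so the image of L_A is exactly the column-space of A.
combo = sum((v[j] * A.col(j) for j in range(A.cols)), sp.zeros(A.rows, 1))
assert A * v == combo
```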

Going the other way, given a linear transformation $T\colon V\to W$ between finite-dimensional spaces, if we pick a basis $\alpha$ for $V$ and a basis $\beta$ for $W$, then we can define a matrix $[T]_{\alpha}^{\beta}$, called the coordinate matrix of $T$ with respect to $\alpha$ and $\beta$. It is the $\dim(W)\times\dim(V)$ matrix whose columns are the coordinate vectors, relative to $\beta$, of the images under $T$ of the vectors in $\alpha$. This matrix has the property that for every $\mathbf{v}\in V$, $$[T]_{\alpha}^{\beta}[\mathbf{v}]_{\alpha} = [T(\mathbf{v})]_{\beta},$$ where $[\mathbf{v}]_{\alpha}$ is the coordinate vector of $\mathbf{v}$ with respect to the basis $\alpha$, and $[T(\mathbf{v})]_{\beta}$ is the coordinate vector of $T(\mathbf{v})$ with respect to $\beta$. If we do this, then the rank of $T$ equals the rank of $[T]_{\alpha}^{\beta}$, and the nullity of $T$ equals the nullity of $[T]_{\alpha}^{\beta}$.
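
As a concrete (made-up) illustration, take $V = W$ to be the space of polynomials of degree at most $2$, let $T$ be the differentiation map $T(p) = p'$, and take $\alpha = \beta = \{1, x, x^2\}$. A sympy sketch that builds $[T]_{\alpha}^{\beta}$ column by column and checks the defining property might look like this:

```python
import sympy as sp

x = sp.symbols('x')
basis = [sp.Integer(1), x, x**2]   # alpha = beta = {1, x, x^2}

def coords(p):
    """Coordinate vector of a polynomial p (degree <= 2) with respect to {1, x, x^2}."""
    p = sp.expand(p)
    return [p.coeff(x, k) for k in range(3)]

# The columns of M = [T]_alpha^beta are the coordinate vectors of T(b) = b' for each basis vector b
M = sp.Matrix([coords(sp.diff(b, x)) for b in basis]).T
print(M)                    # Matrix([[0, 1, 0], [0, 0, 2], [0, 0, 0]])
print(M.rank())             # 2: the image of T is spanned by {1, x}
print(len(M.nullspace()))   # 1: the kernel of T consists of the constant polynomials

# The defining property [T]_alpha^beta [v]_alpha = [T(v)]_beta, checked on one sample polynomial
p = 3 + 5*x + 7*x**2
assert M * sp.Matrix(coords(p)) == sp.Matrix(coords(sp.diff(p, x)))
```

And indeed $\mathrm{rank}(T) + \mathrm{nullity}(T) = 2 + 1 = 3 = \dim(V)$, as the Rank-Nullity Theorem predicts.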

So the notions of rank and nullity for matrices and for linear transformations correspond to one another under the correspondence between matrices and linear transformations.