Is there a name for a matrix where the column space equals the row space? Does it have any interesting properties?
Solution 1:
A geometric characterization of these matrices would be the following:
An $(n\times n)$-matrix $A$ is the same as a linear transformation $A:\ {\mathbb R}^n\to{\mathbb R}^n$. Provide ${\mathbb R}^n$ with the standard scalar product.
Claim: The condition ${\rm Col}\,A={\rm Row}\,A$ is equivalent to ${\rm ker}\,A=({\rm im}\, A)^\perp$. In particular any orthogonal projection would have this property.
Proof. The condition ${\rm Col}\,A={\rm Row}\,A$ means that ${\rm im}\,A={\rm im}\,A^t$. Take an $x\in{\ker}\,A$. Then $$0=\langle A x,y\rangle=\langle x,A^t y\rangle\qquad\forall y\ ;$$ therefore $x$ is orthogonal to ${\rm im}\,A^t={\rm im}\,A$. It follows that ${\rm ker}\,A\subset({\rm im}\,A)^\perp$, whence ${\rm ker}\,A=({\rm im}\, A)^\perp$ by counting dimensions.
Conversely, assume that ${\rm ker}\,A=({\rm im}\, A)^\perp$. Take an $x\in{\rm im}\,A^t$. Then $x=Az$ for a $z\in{\mathbb R}^n$, and we have $$\langle u,x\rangle=\langle u,A^t z\rangle=\langle Au,z\rangle =0\qquad\forall u\in{\rm ker}\,A\ .$$ It follows that ${\rm im}\,A^t\subset({\rm ker}\,A)^\perp={\rm im}\,A$ and therefore ${\rm im}\,A^t={\rm im}\,A$, as $A$ and $A^t$ have the same rank.
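For a concrete sanity check of the claim, here is a small numerical sketch (assuming NumPy is available) that builds a random orthogonal projection, which has this property, and verifies both ${\rm Col}\,A={\rm Row}\,A$ and ${\rm ker}\,A=({\rm im}\,A)^\perp$:

```python
import numpy as np

rng = np.random.default_rng(0)

# A random orthogonal projection P onto a 2-dimensional subspace of R^4.
Q, _ = np.linalg.qr(rng.standard_normal((4, 2)))   # orthonormal basis of the subspace
P = Q @ Q.T                                        # P = P^T = P^2

# Col P = Row P: appending the columns of P^T to those of P must not raise the rank.
r = np.linalg.matrix_rank(P)
assert np.linalg.matrix_rank(np.hstack([P, P.T])) == r

# ker P = (im P)^perp: every null vector is orthogonal to every column of P.
_, _, Vt = np.linalg.svd(P)
null_basis = Vt[r:].T                              # basis of ker P
assert np.allclose(null_basis.T @ Q, 0)            # Q spans im P
```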
Solution 2:
Perhaps these should be called scaled $r$-dimensional cosets of the unitary group $U(n)$.
By Theorem 14 on page 6 of Travis Schedler's lecture 18, every $A$, viewed as a linear transformation, has a polar decomposition $A=S\sqrt{A^*A}$, where $S$ is an isometry (unitary over $\mathbb{C}$ or orthogonal over $\mathbb{R}$). Here $A^*A$ is positive semidefinite by a previous theorem (and positive definite when $A$ is invertible), and its square root is defined as the unitary/orthogonal conjugate $U\Lambda'U^*$ of the diagonal matrix $\Lambda'=\sqrt{\Lambda}$ of nonnegative square roots of its eigenvalues ($\Lambda_{ii}=\lambda_i$), taken from its (necessarily diagonal) Schur decomposition $A^*A=U\Lambda U^*$. Similarly, $A^*=S\,'\sqrt{A^*A}$ for another isometry $S\,'$, since $(A^*A)^*=A^*A$. Thus $A^T=S\,'S^*A$, so I think your $B$ would also have to be an isometry, and over $\mathbb{R}$, special orthogonal (with determinant $1$). If we place the $\lambda_i$ in nonincreasing order (by permuting the columns of $U$), we see that $\Lambda'$ lives in an embedding (injection) of the $r$-dimensional diagonal matrices with strictly positive entries (scale matrices) from $\mathbb{R}^{r\times r}$ into $\mathbb{R}^{n\times n}$. Thus $A=S\,U\Lambda'\,U^*$ is an isometry ($S$) or "permuted rotation" of an $r$-dimensional inhomogeneous dilation ($\Lambda'$) about the axes of some orthonormal basis ($U$) of $\mathbb{R}^n$, giving a fairly good geometric picture of the matrix space $\{A\in\mathbb{R}^{n\times n}\mid A\mathbb{R}^n=A^T\mathbb{R}^n\}$.
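As a numerical illustration of the polar decomposition used above, the factors can be read off the SVD; this is only a sketch assuming NumPy (`scipy.linalg.polar` computes the same right polar factorization):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))

# From the SVD A = W @ diag(s) @ Vt we get
#   sqrt(A^T A) = V @ diag(s) @ V^T   and   S = W @ Vt,
# so that A = S @ sqrt(A^T A).
W, s, Vt = np.linalg.svd(A)
P = Vt.T @ np.diag(s) @ Vt          # positive semidefinite factor sqrt(A^T A)
S = W @ Vt                          # orthogonal (isometry) factor

assert np.allclose(S @ S.T, np.eye(4))   # S is an isometry
assert np.allclose(P, P.T)               # P is symmetric (its eigenvalues are the s_i >= 0)
assert np.allclose(S @ P, A)             # A = S * sqrt(A^T A)
```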
Note also that unitary matrices (isometries) are of the form $U=e^{iH}$ for $H$ self-adjoint (and such an $H$ commutes with $U$). In fact, we can define a $C^\infty$ (exponential) map $U:\mathbb{R}\rightarrow\mathbb{C}^{n\times n}$ by $U(t)=e^{itH}$, with derivatives $U^{(k)}(t)=(iH)^k\,U(t)$. The space (a Lie algebra) of the matrices $iH$ with $H$ self-adjoint, i.e. of anti-self-adjoint matrices, therefore gives us the tangent space at each point of the space $U(n)$ of unitary matrices, which is a Lie group.
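The exponential map is also easy to check numerically; the following sketch (assuming NumPy and SciPy) builds a Hermitian $H$ and verifies that $e^{iH}$ is unitary and commutes with $H$:

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(2)
X = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
H = (X + X.conj().T) / 2            # a self-adjoint (Hermitian) matrix

U = expm(1j * H)                    # exponential of the anti-self-adjoint matrix iH

assert np.allclose(U @ U.conj().T, np.eye(3))   # U is unitary
assert np.allclose(U @ H, H @ U)                # U commutes with H
```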
I have not found a better necessary condition than Christian's, ${\rm ker}\,A=({\rm im}\, A)^\perp$, or those given in the OP (e.g., $\exists B:A^TB=A$), for $A\mathbb{F}^n=A^T\mathbb{F}^n$ to hold, i.e. for an $n\times n$ matrix $A$ over a field $\mathbb{F}$ to share the same image (column space) as its transpose $A^T$ (or conjugate transpose $A^*$). The best way to characterize these matrices is therefore probably a list of equivalent conditions.
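The condition $A\mathbb{F}^n=A^T\mathbb{F}^n$ itself is easy to test numerically over $\mathbb{R}$: the two spaces always have the same dimension, so they coincide exactly when $[\,A\ \ A^T\,]$ has the same rank as $A$. A minimal sketch, assuming NumPy:

```python
import numpy as np

def same_column_and_row_space(A, tol=1e-10):
    """Test whether im(A) equals im(A^T).

    rank(A) = rank(A^T) always, so the spaces are equal exactly when
    stacking A and A^T side by side does not increase the rank.
    """
    r = np.linalg.matrix_rank(A, tol=tol)
    return np.linalg.matrix_rank(np.hstack([A, A.T]), tol=tol) == r

# A symmetric matrix trivially satisfies the condition; a nonzero nilpotent one need not.
assert same_column_and_row_space(np.array([[2.0, 1.0], [1.0, 3.0]]))
assert not same_column_and_row_space(np.array([[0.0, 1.0], [0.0, 0.0]]))
```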
A sufficient condition is that the matrix $A\in\mathbb{F}^{n \times n}$ over the field $\mathbb{F}$ (e.g. $\mathbb{F}=\mathbb{R} \text{ or }\mathbb{C}$) is diagonalizable, meaning (TFAE; a numerical check is sketched after this list):
- $\exists\,G:A=G\Lambda G^{-1}$ with $G$ invertible and $\Lambda$ diagonal in $\mathbb{F}^{n\times n}$ (spectral/eigen- decomposition)
- the geometric multiplicity of each eigenvalue equals its algebraic multiplicity
- the sum of the dimensions of its eigenspaces is equal to $n$
- there exists a basis of $\mathbb{F}^n$ consisting of eigenvectors of $A$
- the number of linearly independent eigenvectors for each eigenvalue $\lambda$ equals the algebraic degree of $(x-\lambda)$ as a factor of the characteristic polynomial of $A$
- its minimal polynomial is a product of distinct linear factors (splits and is squarefree) over $\mathbb{F}$
- ($\mathbb{F}=\mathbb{R}$): each Jordan block in the real Jordan decomposition of $A$ contains no $2\times2$ identity matrix superdiagonal blocks
- $\exists\,U,T:A=UTU^{-1}$ has a "complete"/complex Schur decomposition over $\mathbb{F}$ with $U$ unitary and $T$ upper triangular, which is only true if all eigenvalues are in $\mathbb{F}$ (and always true for $\mathbb{F}=\mathbb{C}$)
- ($\mathbb{F}=\mathbb{R}$): the Schur decomposition over the reals $A=QSQ^{-1}$ with $Q$ orthogonal and $S$ block upper triangular has only $1\times 1$ (and no $2\times 2$) diagonal blocks within $S$
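A crude floating-point check of diagonalizability, based on the eigenvector-basis condition above, can be sketched as follows (assuming NumPy; the tolerance is only a heuristic):

```python
import numpy as np

def is_diagonalizable(A, tol=1e-8):
    """Heuristic numerical test: A is diagonalizable iff its eigenvectors span F^n,
    i.e. iff the eigenvector matrix returned by np.linalg.eig has full rank."""
    _, G = np.linalg.eig(A)
    return np.linalg.matrix_rank(G, tol=tol) == A.shape[0]

assert is_diagonalizable(np.array([[2.0, 0.0], [0.0, 3.0]]))
assert not is_diagonalizable(np.array([[1.0, 1.0], [0.0, 1.0]]))   # a Jordan block
```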
A stronger sufficient condition (implying the above) is that $A$ is a normal matrix, meaning (TFAE; see the sketch after this list):
- $AA^*=A^*A$ ($A$ commutes with its conjugate transpose $A^*$)
- $A$ is unitarily diagonalizable ($A=U\Lambda U^{-1}$ for $U^*=U^{-1}$)
- $\exists$ unitary $U:A^*=AU$.
The last condition (linguistically at least) completes the analogy with normal subgroups in group theory, for which left and right cosets coincide.
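As a quick numerical confirmation that normality suffices for the original property (a sketch assuming NumPy), take a real normal matrix that is neither symmetric nor orthogonal and apply the rank test from earlier:

```python
import numpy as np

# A real normal matrix (a rotation-scaling block plus a zero row/column).
A = np.array([[1.0, -2.0, 0.0],
              [2.0,  1.0, 0.0],
              [0.0,  0.0, 0.0]])

assert np.allclose(A @ A.T, A.T @ A)                       # A is normal

r = np.linalg.matrix_rank(A)
assert np.linalg.matrix_rank(np.hstack([A, A.T])) == r     # im(A) = im(A^T)
```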
In these cases, the geometric characterization would be that $A$ is an anisotropic scaling or inhomogeneous dilation (which necessarily includes projection when $\text{rank}\;A\lt n$) after an orthogonal (or unitary for $\mathbb{F}=\mathbb{C}$) change of basis (with no isometry $S$). In this way, it is analogous to a self-adjoint operator.
When $A$ is positive definite, its Schur, spectral, and singular value decompositions all coincide. When $A$ is normal, its polar factors commute: $A=UP=PU$ (for $P$ positive semi-definite). In general there exists a polar factorization $A=UP$, but $U$ and $P$ are not guaranteed to commute.
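That commuting-factors statement can also be checked numerically; the sketch below (assuming NumPy) computes the polar factors from the SVD and compares a normal example with a non-normal one:

```python
import numpy as np

def polar(A):
    """Right polar decomposition A = U @ P computed from the SVD
    (scipy.linalg.polar returns the same right polar factors)."""
    W, s, Vt = np.linalg.svd(A)
    return W @ Vt, Vt.T @ np.diag(s) @ Vt

# A normal matrix (rotation-scaling block plus a stretch): its polar factors commute.
N = np.array([[1.0, -2.0, 0.0],
              [2.0,  1.0, 0.0],
              [0.0,  0.0, 4.0]])
U, P = polar(N)
assert np.allclose(N @ N.T, N.T @ N)
assert np.allclose(U @ P, N) and np.allclose(U @ P, P @ U)

# A non-normal matrix: the factors still exist, but they need not commute.
M = np.array([[1.0, 1.0], [0.0, 1.0]])
U, P = polar(M)
assert np.allclose(U @ P, M)
assert not np.allclose(U @ P, P @ U)
```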
The list of equivalences for normal (and diagonalizable) matrices is quite long, surely longer than in the Wikipedia link, so there are probably many applications. One nice such application is in the classification of quadratic forms. For the dimensions of the subspaces of rank $k$, have a look at Grassmannian and Stiefel manifolds.
An earlier version of this post had errors, based on a misunderstanding of this point (in fact, an upper triangular, rather than block upper triangular, Schur form exists precisely for matrices whose characteristic (equivalently, minimal) polynomial has all its roots in $\mathbb{F}$).