Can elementary row operations be done by both left and right multiplication?

So I know that interchanging two rows of a matrix $A$ can be done by left-multiplying it by some permutation matrix $P$ to get $PA$. Similarly, interchanging two columns can be done by right-multiplication, $AP$.

My question is: Can you interchange rows by right-multiplication (of some matrix, not necessarily a permutation matrix)? And interchange columns by left-multiplication? In general, can every elementary row (or column) operation be represented as both a left- and right-multiplication?

Edit: if $A$ is invertible, and we want to find a matrix $B$ such that $AB = PA$, then we can solve for $B$ by left-multiplying both sides by $A^{-1}$ to get $B = A^{-1}PA$. So this is how we can interchange two rows by right-multiplication. Similarly for interchanging columns. But what if $A$ isn't invertible? Can we characterize the cases when it is or isn't possible?
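
For the invertible case, a quick numerical sanity check (a numpy sketch with arbitrarily chosen matrices):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 1.0]])        # an arbitrary invertible matrix
P = np.array([[0.0, 1.0],
              [1.0, 0.0]])        # permutation matrix swapping the two rows
B = np.linalg.inv(A) @ P @ A      # B = A^{-1} P A
print(np.allclose(A @ B, P @ A))  # True: right multiplication by B swaps the rows
```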


If $A$ isn't invertible, it's not always possible.

For a simple counterexample, let $$ A = \pmatrix{ 1&0\\ 0&0\\ } $$ Then for any matrix $$ B = \pmatrix{ w&x\\ y&z\\ } $$ we get $$ AB = \pmatrix{ w&x\\ 0&0\\ } $$ which can't swap the rows of $A$.
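A symbolic version of this check (a sketch assuming sympy is available; any computer algebra system would do):

```python
import sympy as sp

w, x, y, z = sp.symbols('w x y z')
A = sp.Matrix([[1, 0],
               [0, 0]])
B = sp.Matrix([[w, x],
               [y, z]])
# The second row of AB is always zero, so no B can swap the rows of A:
print(A * B)  # Matrix([[w, x], [0, 0]])
```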

So that answers the original question.

But there are some singular matrices $A$ for which row swaps via right multiplication are possible.

For example, if $n > 1$ and $A$ is an $n{\,\times\,}n$ matrix whose rows are all equal, then for any $n{\,\times\,}n$ permutation matrix $P$, we have $PA=A$, hence, taking $B=I_n$, we get $PA=AB$.

For another example, if $$ A= \pmatrix{ 1&-1\\ -1&1\\ } $$ and $$ B= \pmatrix{ -1&0\\ 0&-1\\ } $$ then $AB=-A$ which is the same as $A$ with rows swapped.
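
Both examples are easy to verify numerically; here is a numpy sketch:

```python
import numpy as np

# All rows equal: PA = A for every permutation P, so B = I works.
A1 = np.array([[1, 2],
               [1, 2]])
P = np.array([[0, 1],
              [1, 0]])
print(np.array_equal(P @ A1, A1))        # True, i.e. PA = A = A I

# Second example: AB = -A, which is A with its rows swapped.
A2 = np.array([[1, -1],
               [-1, 1]])
B = -np.eye(2, dtype=int)
print(np.array_equal(A2 @ B, A2[::-1]))  # True
```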

As a partial converse, we have the following claim . . .

Claim:

If $K$ is a field and $A\in M_n(K)$ is such that

  • Some column of $A$ is non-constant and has nonzero sum.
  • For all $n{\,\times\,}n$ permutation matrices $P$, there exists $B\in M_n(K)$ such that $PA=AB$.

then $A$ must be non-singular.

Proof:

Assume the hypothesis.

Then for some $j\in\{1,...,n\}$, the $j$-th column-vector $v_j$ of $A$ is non-constant and has nonzero sum.

It follows (see the lemma, proved at the end) that the span of the set of permutations of $v_j$ is all of $K^n$.

Thus, letting $e_j\in K^n$ be the $j$-th standard basis vector (so that $Ae_j=v_j$), there exist $n{\,\times\,}n$ permutation matrices $P_1,...,P_n$ such that the vectors $P_1Ae_j,...,P_nAe_j$ are linearly independent.

By hypothesis, there exist $B_1,...,B_n\in M_n(K)$ such that for all $i\in\{1,...,n\}$, we have $P_iA=AB_i$.

Then the equations \begin{align*} P_1Ae_j&=AB_1e_j\\[4pt] &\;\,\vdots\\[4pt] P_nAe_j&=AB_ne_j\\[4pt] \end{align*} must all hold, hence, since the vectors on the left are linearly independent, while the vectors on the right are in the image of $A$, it follows that the image of $A$ is all of $K^n$, so $A$ is non-singular, as was to be shown.

To tie up loose ends, we prove the following lemma . . .

Lemma:

Fix an integer $n > 1$, and let $X=K^n$ where $K$ is a field.

Let ${\large{\mathcal{P}}}$ be the set of $n{\,\times\,}n$ permutation matrices.

If $x\in X$ is non-constant and has nonzero sum, then the span of the set $\{Px{\,\mid\,}P\in {\large{\mathcal{P}}}\}$ is all of $X$.

Proof:

Suppose $x\in X$ is non-constant and has nonzero sum.

Let $V$ be the span of the set $\{Px{\,\mid\,}P\in {\large{\mathcal{P}}}\}$.

Suppose $V$ is a proper subspace of $X$.

Then there exists some nonzero vector $g\in X$ such that $g{\,\cdot\,}v=0$, for all $v\in V$.

In particular, we have $g{\,\cdot\,}Px=0$, for all $P\in {\large{\mathcal{P}}}$.

Then $g{\,\cdot\,}x=g{\,\cdot\,}I_nx=0$, hence $g$ is non-constant: if all entries of $g$ were equal to some constant $c$, then $c\ne 0$ (since $g$ is nonzero), so $g{\,\cdot\,}x$ would be $c$ times the sum of the entries of $x$, which is nonzero, contradiction.

Thus, writing $g=(g_1,...,g_n)$, there are indices $i,j$ with $g_i\ne g_j$.

Since $x$ is non-constant, there exists $y\in\{Px{\,\mid\,}P\in \mathcal{P}\}$ such that $y_i\ne y_j$.

Let $T\in {\large{\mathcal{P}}}$ be the transposition that swaps the entries in positions $i,j$ but leaves all other entries fixed. \begin{align*} \text{Then}\;\;&g{\,\cdot\,}y=0\;\,\text{and}\;\,g{\,\cdot\,}Ty=0\\[4pt] \implies\;&g{\,\cdot\,}y-g{\,\cdot\,}Ty=0\\[4pt] \implies\;&(g_iy_i+g_jy_j)-(g_iy_j+g_jy_i)=0\\[4pt] \implies\;&(g_i-g_j)(y_i-y_j)=0\\[4pt] \end{align*} which is a contradiction, since $g_i\ne g_j$ and $y_i\ne y_j$.

Hence we must have $V=X$, which completes the proof of the lemma.


Interesting question! There is a nice connection between your question and the area of mathematics called representation theory, which I want to highlight. The short answer is:

  1. The only matrices $A$ for which all elementary column operations on $A$ can be performed by left multiplication $PA$ are the matrices with full column rank (trivial kernel) and the zero matrix.
  2. When $\operatorname{char} \mathbb{F} = 0$, the only matrices $A \in M_{m \times n}(\mathbb{F})$ for which all column permutations can be performed by left multiplication $PA$ are, in addition to the previous ones, the matrices $A$ with $\ker(A) = \operatorname{Span} \{ e_1 + \dots + e_n \}$ (which forces the sum of the entries in each row of $A$ to be zero) or $\ker(A) = \{ (x_1,\dots,x_n)^T \, | \, x_1 + \dots + x_n = 0 \}$ (which forces all the columns of $A$ to be identical).

For a "non-obvious at first look" example of $(2)$, the matrix $$ A = \begin{pmatrix} 1 & 2 & -3 \\ 0 & 1 & -1 \\ 1 & 1 & -2 \end{pmatrix} $$ has rank two but any column permutation operation on $A$ can be realized as left multiplication $PA$ (not necessarily by a permutation matrix $P$!).


For simplicity of notation, I'll go the other way and discuss which column operations can be implemented using left multiplication.

Let's start with the following question. Fix a matrix $Q \in M_n(\mathbb{F})$, which might be a permutation matrix, an "add a multiple of one column to another" matrix, or an arbitrary matrix. The first question to ask is: what are the matrices $A \in M_{m \times n}(\mathbb{F})$ for which there exists $P \in M_m(\mathbb{F})$ such that $PA = AQ$? For example, if $Q$ describes swapping two specific columns, this asks for which matrices $A$ the swap can be done by left multiplication. If $PA = AQ$ and $x \in \mathbb{F}^n$ satisfies $Ax = 0$, then $$ 0 = P(Ax) = (PA)x = (AQ)x = A(Qx). $$ This shows that $Q(\ker(A)) \subseteq \ker(A)$, i.e., $\ker(A)$ must be $Q$-invariant.

In fact, this is also a sufficient condition. Assume $\ker(A)$ is $Q$-invariant and write the columns of $A$ as $a_1,\dots,a_n$. They might be linearly dependent, so choose a basis $a_{i_1},\dots,a_{i_k}$ for the column space, define $Pa_{i_j} = AQe_{i_j}$, and extend $P$ arbitrarily on a complement of the column space of $A$. Then by definition $(PA)e_{i_j} = Pa_{i_j} = AQe_{i_j}$. What about the other standard basis vectors? Say $1 < i_1$, so that $a_1$ is not a basis column; it depends linearly on $a_{i_1},\dots,a_{i_k}$, say $a_1 = x_1 a_{i_1} + \dots + x_k a_{i_k}$. Then

$$ (PA)e_1 = Pa_1 = P(x_1 a_{i_1} + \dots + x_k a_{i_k}) = x_1 Pa_{i_1} + \dots + x_k P a_{i_k} \\ = x_1 AQe_{i_1} + \dots + x_k AQe_{i_k}. $$

However, $x_1 e_{i_1} + \dots + x_k e_{i_k} - e_1 \in \ker(A)$ so that by assumption $$ A(Q(x_1 e_{i_1} + \dots + x_k e_{i_k} - e_1)) = x_1 AQe_{i_1} + \dots + x_k AQe_{i_k} - AQe_1 = 0 $$ so $$ (PA)e_1 = x_1 AQe_{i_1} + \dots + x_k AQe_{i_k} = AQe_1. $$

This gives us a concrete answer: Given $Q \in M_n(\mathbb{F})$, the matrices $A \in M_{m \times n}(\mathbb{F})$ for which there exists $P \in M_m(\mathbb{F})$ such that $PA = AQ$ are precisely the matrices for which $\ker(A)$ is $Q$-invariant. In particular, if $m = n$ and $A$ is invertible, this can always be done as you noticed.
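
Over $\mathbb{R}$, this criterion is easy to test numerically. The sketch below (the helper name `left_multiplier` is mine, not standard) uses the fact that $P = AQA^{+}$, with $A^{+}$ the Moore-Penrose pseudoinverse, reproduces the construction above whenever $\ker(A)$ is $Q$-invariant:

```python
import numpy as np

def left_multiplier(A, Q, tol=1e-10):
    """Return P with PA = AQ if ker(A) is Q-invariant, else None."""
    P = A @ Q @ np.linalg.pinv(A)  # A Q A^+ agrees with AQ on the column space of A
    return P if np.allclose(P @ A, A @ Q, atol=tol) else None

# The row-swap counterexample from the first answer, now in column form:
A = np.array([[1.0, 0.0],
              [0.0, 0.0]])
Q = np.array([[0.0, 1.0],
              [1.0, 0.0]])        # swap the two columns
# ker(A) = Span{e2} but Q e2 = e1 is not in ker(A), so no P exists:
print(left_multiplier(A, Q))     # None
```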

Example: Let's say $$ Q = \begin{pmatrix} 0 & 1 & 0 \\ 1 & 0 & 0 \\ 0 & 0 & 1 \end{pmatrix} \in M_3(\mathbb{R}) $$ so that $AQ$ swaps the first two columns of $A$. Some of the $Q$-invariant subspaces of $\mathbb{R}^3$ (there are others, such as $\operatorname{Span}\{e_1 - e_2\}$) are:

  1. $V_0 = \{ 0 \}$.
  2. $V_1 = \textrm{Span} \{ e_1, e_2 \}$.
  3. $V_2 = \textrm{Span}\{ e_3 \}$.
  4. $V_3 = \textrm{Span}\{ e_1, e_2, e_3 \}$.

The $3 \times 3$ matrices $A$ whose kernel is one of these four subspaces are:

  1. Invertible matrices $A \in M_{3 \times 3}(\mathbb{R})$.
  2. Matrices $A \in M_{3 \times 3}(\mathbb{R})$ whose kernel is precisely $\textrm{Span} \{ e_1, e_2 \}$: $$ \begin{pmatrix} 0 & 0 & a \\ 0 & 0 & b \\ 0 & 0 & c \end{pmatrix}, \, (a,b,c) \neq (0,0,0) $$
  3. Matrices $A \in M_{3 \times 3}(\mathbb{R})$ whose kernel is precisely $\textrm{Span} \{ e_3 \}$: $$ \begin{pmatrix} a & b & 0 \\ c & d & 0 \\ e & f & 0 \end{pmatrix}, $$ where the first two columns are linearly independent (equivalently, at least one of the minors $ad - bc$, $af - be$, $cf - de$ is nonzero). Note that for such matrices you can't always get $AQ$ as $PA$ with $P$ a permutation matrix, but you can always find some $P$ such that $PA = AQ$; a numerical instance follows this list.
  4. The zero matrix.
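
For instance, here is a numpy sketch of a matrix from family $3$ together with a (necessarily non-permutation) $P$ realizing the swap of its first two columns:

```python
import numpy as np

A = np.array([[1.0, 2.0, 0.0],
              [3.0, 4.0, 0.0],
              [5.0, 6.0, 0.0]])   # family 3: ker(A) = Span{e3}, which is Q-invariant
Q = np.array([[0.0, 1.0, 0.0],
              [1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0]])   # swap the first two columns
P = A @ Q @ np.linalg.pinv(A)
print(np.allclose(P @ A, A @ Q))  # True, although P is not a permutation matrix
print(P.round(3))
```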

In general, the invariant subspaces depend on the field over which you are working. For instance, if $Q$ had been the cyclic permutation matrix sending $e_1 \mapsto e_2 \mapsto e_3 \mapsto e_1$, you would get more invariant subspaces over $\mathbb{C}$ than over $\mathbb{R}$ (because that $Q$ has non-real eigenvalues and is only diagonalizable over $\mathbb{C}$), and hence more matrices $A$.


Now that we've "answered" the question for a single $Q$, let's ask: what are the matrices $A$ for which all column permutations can be realized as left multiplications? This means we are looking for matrices $A$ such that $\ker(A)$ is $Q_{\sigma}$-invariant for every permutation $\sigma \in S_n$, where $Q_{\sigma}$ is the corresponding permutation matrix. In the language of representation theory, the group $S_n$ acts on $\mathbb{F}^n$ via the $Q_{\sigma}$, and we are looking for matrices $A$ for which $\ker(A)$ is a subrepresentation of $\mathbb{F}^n$. It is well known that in characteristic zero there are only two non-trivial subrepresentations, namely $$ \{ (x_1, \dots, x_n)^T \, | \, x_1 + \dots + x_n = 0 \} \quad \text{and} \quad \operatorname{Span} \{ (1, \dots, 1)^T \}. $$ This gives us the result quoted at the beginning of the answer.

Finally, we can ask for which matrices $A$ all elementary column operations can be realized by left multiplication. Since the elementary matrices generate the group $GL_n(\mathbb{F})$, the question translates to finding the subrepresentations of $\mathbb{F}^n$ under the action of $GL_n(\mathbb{F})$. Here there are no non-trivial subrepresentations, so the only matrices for which this is possible are the zero matrix and the matrices with trivial kernel (full column rank).
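
As a concrete illustration, the rank-two example matrix from the beginning of this answer handles every column permutation, but fails for a column-addition operation, because its kernel is not invariant under it (numpy sketch):

```python
import numpy as np

A = np.array([[1.0, 2.0, -3.0],
              [0.0, 1.0, -1.0],
              [1.0, 1.0, -2.0]])
Q = np.eye(3)
Q[0, 1] = 1.0                     # elementary operation: add column 1 to column 2
v = np.ones(3)                    # spans ker(A)
print(Q @ v)                      # [2. 1. 1.]: Q maps ker(A) outside itself
P = A @ Q @ np.linalg.pinv(A)
print(np.allclose(P @ A, A @ Q))  # False (and by the kernel criterion no P can exist)
```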


If you allow right multiplication with transposes then we have:

$$(A^TP^T)^T=PA$$

but, as @quasi shows, it's not always possible otherwise.