A rank-one matrix is the product of two vectors: an $n\times m$ matrix $A$ has rank one if and only if $A=\mathbf v\mathbf w^T$ for some nonzero $\mathbf v\in\mathbb R^n$ and $\mathbf w\in\mathbb R^m$.

Hints:

$A=\mathbf v\mathbf w^T\implies\operatorname{rank}A=1$ should be pretty easy to prove directly. Multiply a vector in $\mathbb R^m$ by $A$ and see what you get.

For the other direction, think about what $A$ does to the basis vectors of $\mathbb R^m$ and what this means about the columns of $A$.


Solution

Suppose $A=\mathbf v\mathbf w^T$ with $\mathbf v\in\mathbb R^n$ and $\mathbf w\in\mathbb R^m$ both nonzero. If $\mathbf u\in\mathbb R^m$, then $A\mathbf u=\mathbf v\mathbf w^T\mathbf u=(\mathbf u\cdot\mathbf w)\mathbf v$. Thus $A$ maps every vector in $\mathbb R^m$ to a scalar multiple of $\mathbf v$, so $\operatorname{im}A\subseteq\operatorname{span}\{\mathbf v\}$; moreover $A\mathbf w=(\mathbf w\cdot\mathbf w)\mathbf v\ne\mathbf 0$, hence $\operatorname{rank}A=\dim\operatorname{im}A=1$.
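As a quick numerical sanity check (not part of the proof; a sketch using NumPy with arbitrary test vectors), one can verify that an outer product has rank one and sends every vector to a multiple of $\mathbf v$:

```python
import numpy as np

rng = np.random.default_rng(0)
v = rng.standard_normal(4)   # plays the role of v in R^n (here n = 4)
w = rng.standard_normal(6)   # plays the role of w in R^m (here m = 6)

A = np.outer(v, w)           # A = v w^T, an n x m matrix

# the image of A is spanned by v, so the rank is 1
print(np.linalg.matrix_rank(A))          # -> 1

# A u = (u . w) v for an arbitrary u in R^m
u = rng.standard_normal(6)
print(np.allclose(A @ u, (u @ w) * v))   # -> True
```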

Now assume $\operatorname{rank}A=1$, and pick a nonzero $\mathbf v\in\mathbb R^n$ spanning the one-dimensional image of $A$. Then for every $\mathbf u\in\mathbb R^m$ we have $A\mathbf u=k\mathbf v$ for some scalar $k$ depending on $\mathbf u$. In particular, this is true for the standard basis vectors of $\mathbb R^m$, so every column of $A$ is a multiple of $\mathbf v$. That is, $$ A=\pmatrix{w_1\mathbf v & w_2\mathbf v & \cdots & w_m\mathbf v}=\mathbf v\pmatrix{w_1&w_2&\cdots&w_m}=\mathbf v\mathbf w^T. $$
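The column argument also suggests a concrete way to recover $\mathbf v$ and $\mathbf w$ from a given rank-one matrix. A sketch (NumPy, with a hypothetical test matrix; the projection formula $\mathbf w = A^T\mathbf v/(\mathbf v\cdot\mathbf v)$ is one convenient way to read off the column multipliers):

```python
import numpy as np

# a hypothetical rank-one matrix to decompose
A = np.outer([1.0, -2.0, 3.0], [2.0, 0.0, 5.0, -1.0])

# take any nonzero column of A as v
j0 = next(j for j in range(A.shape[1]) if np.any(A[:, j] != 0))
v = A[:, j0]

# each column A[:, j] equals w_j * v; recover w_j by projecting onto v
w = A.T @ v / (v @ v)

print(np.allclose(A, np.outer(v, w)))    # -> True
```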


Suppose that $A$ has rank one. Then its image is one-dimensional, so there is some nonzero $v$ that generates it. Moreover, for any vector $w$ in the domain, we can write $$Aw = \lambda(w)v$$

for some scalar $\lambda(w)$ that depends linearly on $w$, by virtue of $A$ being linear and $v$ being a basis of the image of $A$. This then defines a nonzero linear functional $\mathbb R^n \longrightarrow \mathbb R$, which must be given by taking the dot product with some $w_0$; say $\lambda(w) =\langle w_0,w\rangle$. It follows then that

$$ A(w) = \langle w_0,w\rangle v$$ for every $w$, or, what is the same, that $A= vw_0^t$.
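In fact such a $w_0$ can be written down explicitly (a small addition, not part of the original argument): taking the inner product of $Aw=\lambda(w)v$ with $v$ gives $\langle v,Aw\rangle=\lambda(w)\langle v,v\rangle$, and since $\langle v,Aw\rangle=\langle A^tv,w\rangle$, one may take $$ w_0=\frac{A^tv}{\langle v,v\rangle}. $$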


Dealing with the more difficult direction: I will leave the details for you to complete, since this is a useful exercise to go through, in terms of familiarizing yourself with the use of index notation and gaining practice in thinking at the required level of abstraction. But there may be a fairly intuitive and straightforward solution along the following lines:

Suppose $n \ge 2$, and let $A$ be an $n \times n$ matrix which acts on the $n$-dimensional vector space $F^n$.

First you may show that if some $2\times 2$ minor of $A$ does not vanish, then the image has dimension at least $2$; by permuting rows and columns you can see that this reduces to the case $n=2$, which is straightforward.

Second, knowing that all $2\times 2$ minors of a rank-$1$ matrix must vanish, you have the condition: $$ a_{ij}a_{kl}=a_{il}a_{kj} $$
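As a sanity check on this condition (not part of the original outline; a NumPy sketch with arbitrary test matrices), one can verify that all $2\times 2$ minors of a rank-one matrix vanish, while a higher-rank matrix has some nonvanishing minor:

```python
import numpy as np
from itertools import combinations

def minors_vanish(A, tol=1e-12):
    """True iff every 2x2 minor a_ij*a_kl - a_il*a_kj of A is (numerically) zero."""
    rows, cols = A.shape
    return all(
        abs(A[i, j] * A[k, l] - A[i, l] * A[k, j]) < tol
        for i, k in combinations(range(rows), 2)
        for j, l in combinations(range(cols), 2)
    )

A = np.outer([1.0, 2.0, -1.0], [3.0, 0.0, 4.0])          # rank one
B = np.eye(3)                                            # rank three

print(minors_vanish(A), np.linalg.matrix_rank(A) <= 1)   # -> True True
print(minors_vanish(B), np.linalg.matrix_rank(B) <= 1)   # -> False False
```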

Suppose first that none of the entries of $A$ is zero. Then you should have no trouble in deriving the desired conclusion.

We now need to show that the problem can, in effect, be reduced to the case already dealt with.

If $A \ne 0$ then it has a nonzero entry; WLOG $a_{11} \ne 0$. You may now show that if $a_{1j}=0$, then we must have $a_{kj}=0$ for all $k$.

The analogous statement for zeros in the first column (if $a_{i1}=0$ then $a_{ik}=0$ for all $k$) follows if you invoke the fact that the rank of a matrix is equal to the rank of its transpose.
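Putting the outline together (filling in one of the details left to the reader): once $a_{11}\ne 0$ after permuting rows and columns if necessary, the vanishing-minor identity $a_{11}a_{ij}=a_{i1}a_{1j}$ gives $a_{ij}=v_iw_j$ with $v_i=a_{i1}$ and $w_j=a_{1j}/a_{11}$. A short numerical sketch of this construction (NumPy, arbitrary test data):

```python
import numpy as np

A = np.outer([2.0, -1.0, 3.0], [1.0, 4.0, 0.0, 2.0])  # rank one, with a_11 = 2 != 0

# assume (after permuting rows and columns if necessary) that A[0, 0] != 0
v = A[:, 0]            # v_i = a_{i1}
w = A[0, :] / A[0, 0]  # w_j = a_{1j} / a_{11}

# the vanishing-minor identity a_{11} a_{ij} = a_{i1} a_{1j} gives A = v w^T
print(np.allclose(A, np.outer(v, w)))  # -> True
```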