Linear transformation maps the first quadrant to a closed set

My question is

Let $A: \mathbb{R}^n \to \mathbb{R}^m$ be a linear transformation. How can one prove that $\{Ay \mid y\geq 0\}$ is a closed set?

Here $y\geq 0$ means each component of $y$ $\geq 0$.

I can simplify the question in two ways:

  1. Since the image of $A$ is a closed linear subspace of $\mathbb{R}^m$, by replacing $\mathbb{R}^m$ with this image we may assume $A$ is surjective.

  2. After a change of basis of the codomain and a reordering of the basis of the domain, we may assume $A$ is of the form $[I\mid B]$.

Any help or hints are appreciated, thank you.


Ideas:

Let $v_i=Ae_i$; then my original statement is equivalent to saying that the set $\{\sum a_iv_i \mid a_i\geq 0\}$ is closed. So I should now prove the following:

For any $v_1,...,v_k \in \mathbb{R}^n$, the set $S=\{\sum a_iv_i |a_i\geq 0\}$ is a closed subset of $\mathbb{R}^n$.
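As a quick sanity check of this reformulation, here is a tiny Python sketch (the matrix and the vector are my own illustrative choices) verifying that $Ay$ is exactly the combination $\sum_i y_i v_i$ of the columns $v_i = Ae_i$:

```python
# A*y is the combination sum_i y_i * (A e_i) of the columns, so
# {Ay : y >= 0} is exactly the set of non-negative combinations of the
# columns v_i = A e_i.  (Matrix and vector chosen by me for illustration.)
A = [[1, 2, 0],
     [0, 1, 3]]

def matvec(A, y):
    """Ordinary matrix-vector product."""
    return [sum(row[i] * y[i] for i in range(len(y))) for row in A]

def column(A, i):
    """The i-th column of A, i.e. A applied to the standard basis vector e_i."""
    return [row[i] for row in A]

y = [2, 0, 5]
combo = [sum(y[i] * column(A, i)[r] for i in range(3)) for r in range(2)]
assert matvec(A, y) == combo
```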


I suspect that the set $S$ is an intersection of half spaces, from this the closedness will be obvious.

Equivalently, I suspect that for each $x\notin S$, there exists a closed half space $H$ (bounded by a codimension $1$ linear subspace) s.t. $S\subset H$ and $x\notin H$.

Is this statement true? After drawing some examples, I believe it is.
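Here is a toy check of the half-space description in $\mathbb{R}^2$ (the generators and the two half planes are my own illustrative choices, worked out by hand for this example):

```python
# Toy check in R^2: the cone generated by v1 = (1, 0) and v2 = (1, 1)
# should equal the intersection of the half planes {y >= 0} and {x - y >= 0}.

def cone_coords(p, v1=(1.0, 0.0), v2=(1.0, 1.0)):
    """Solve a*v1 + b*v2 = p by Cramer's rule (v1, v2 are independent)."""
    det = v1[0] * v2[1] - v2[0] * v1[1]
    a = (p[0] * v2[1] - v2[0] * p[1]) / det
    b = (v1[0] * p[1] - p[0] * v1[1]) / det
    return a, b

def in_cone(p):
    """Membership via the generators: non-negative coordinates a, b."""
    a, b = cone_coords(p)
    return a >= 0 and b >= 0

def in_halfspaces(p):
    """Membership via the conjectured half-plane description."""
    x, y = p
    return y >= 0 and x - y >= 0

# The two descriptions agree on a grid of sample points.
grid = [(i / 2, j / 2) for i in range(-8, 9) for j in range(-8, 9)]
assert all(in_cone(p) == in_halfspaces(p) for p in grid)
```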


New ideas: I am proving that the set $\{\sum a_iv_i\mid a_i\geq 0 \}$ is closed, and this is easy if the $v_i$'s are linearly independent (the cone is then the image of the closed positive orthant under a linear homeomorphism onto its image), so I would like to prove the following:

Let $\mathcal{L}=\{$linearly independent subsets of $\{v_i\}\}$; then $\{\sum a_iv_i\mid a_i\geq 0\}=\bigcup_{T\in \mathcal{L}}\{$non-negative linear combinations of $T\}$. (I write $\mathcal{L}$ rather than $S$, since $S$ already denotes the cone.)

And this union is closed, since it is a finite union of closed sets.
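As a toy check of this decomposition in $\mathbb{R}^2$ (the three vectors and the sample point are my own illustrative choices): a point of the cone should already lie in the non-negative span of some linearly independent pair.

```python
from itertools import combinations

# Toy illustration in R^2: v1 = (1, 0), v2 = (1, 1), v3 = (0, 1), and a
# point x in the cone, x = 0*v1 + 2*v2 + 1*v3 = (2, 3).  The claim is that
# x lies in the non-negative span of some *linearly independent* subset.
V = [(1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
x = (2.0, 3.0)

def nonneg_coords_in_pair(x, u, w):
    """Solve a*u + b*w = x by Cramer's rule; return (a, b) if the pair is
    independent and both coefficients are non-negative, else None."""
    det = u[0] * w[1] - w[0] * u[1]
    if det == 0:
        return None  # dependent pair
    a = (x[0] * w[1] - w[0] * x[1]) / det
    b = (u[0] * x[1] - x[0] * u[1]) / det
    return (a, b) if a >= 0 and b >= 0 else None

# Some independent pair must work; e.g. {v2, v3} gives x = 2*v2 + 1*v3.
witnesses = {(i, j): nonneg_coords_in_pair(x, V[i], V[j])
             for i, j in combinations(range(3), 2)}
assert any(c is not None for c in witnesses.values())
```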

Equivalently, given $x=\sum a_iv_i$ with $a_i\geq 0$, I want to write $x=\sum b_iw_i$, where $b_i\geq 0$ and $\{w_i\}\subset \{v_i\}$ is linearly independent.

My proof is as follows: replace $\{v_i\}$ with a minimal subset of itself, i.e. one such that $x$ cannot be expressed as a non-negative linear combination of any strictly smaller subset. I claim that this minimal set is linearly independent.

Suppose not. Then we can write $c_1v_1+c_2v_2+\cdots + c_nv_n=0$ with at least one $c_i$ positive (multiplying the whole expression by $-1$ if necessary), and after reordering I may assume $c_i>0$ for $i=1,2,\dots ,k$ and $c_i \leq 0$ for $i>k$.

Now set $b=a_1/c_1=\min _{i=1,2,\dots ,k}{a_i/c_i}$ (reordering again so that the minimum is attained at $i=1$). Then $$x=\sum a_iv_i-b\sum c_iv_i= (a_2-bc_2)v_2 +(a_3-bc_3)v_3+\cdots + (a_n-bc_n)v_n.$$ Each coefficient is non-negative: for $2\le i\le k$ we have $a_i-bc_i\ge 0$ by the minimality of $b$, and for $i>k$ we have $c_i\le 0$, so $a_i-bc_i\ge a_i\ge 0$. Thus $x$ is a non-negative linear combination of fewer vectors, contradicting my assumption; hence the minimal set $\{v_i\}$ is linearly independent.
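The minimality argument above is effectively an algorithm (conic Carathéodory). Here is a sketch in Python using exact rational arithmetic; all function names are mine, and the example input is illustrative:

```python
from fractions import Fraction

def null_vector(vectors):
    """Return c != 0 with sum_i c[i] * vectors[i] == 0, or None if the
    vectors are linearly independent (Gaussian elimination over Q)."""
    if not vectors:
        return None
    n, m = len(vectors), len(vectors[0])
    # Matrix with the given vectors as columns.
    M = [[Fraction(vectors[j][i]) for j in range(n)] for i in range(m)]
    pivots, row = [], 0
    for col in range(n):
        piv = next((r for r in range(row, m) if M[r][col] != 0), None)
        if piv is None:
            continue  # free column
        M[row], M[piv] = M[piv], M[row]
        inv = M[row][col]
        M[row] = [x / inv for x in M[row]]
        for r in range(m):
            if r != row and M[r][col] != 0:
                f = M[r][col]
                M[r] = [x - f * y for x, y in zip(M[r], M[row])]
        pivots.append(col)
        row += 1
    free = [c for c in range(n) if c not in pivots]
    if not free:
        return None
    # Kernel vector: set the first free variable to 1, read off the rest.
    c = [Fraction(0)] * n
    c[free[0]] = Fraction(1)
    for r, pc in enumerate(pivots):
        c[pc] = -M[r][free[0]]
    return c

def conic_caratheodory(vectors, coeffs):
    """Given x = sum a_i v_i with a_i >= 0, return a linearly independent
    subset of the v_i and non-negative coefficients representing the same x."""
    vecs = [tuple(Fraction(t) for t in v) for v in vectors]
    a = [Fraction(t) for t in coeffs]
    while True:
        c = null_vector(vecs)
        if c is None:
            return vecs, a  # the surviving vectors are independent
        if all(ci <= 0 for ci in c):
            c = [-ci for ci in c]  # make some c_i positive
        # b = min a_i / c_i over indices with c_i > 0, attained at j.
        j = min((i for i in range(len(c)) if c[i] > 0),
                key=lambda i: a[i] / c[i])
        b = a[j] / c[j]
        a = [ai - b * ci for ai, ci in zip(a, c)]  # still >= 0, a[j] == 0
        del vecs[j], a[j]  # drop the zeroed-out vector

# Example: x = (2, 2) = 1*(1,0) + 1*(0,1) + 1*(1,1).
vecs, a = conic_caratheodory([(1, 0), (0, 1), (1, 1)], (1, 1, 1))
x_check = [sum(ai * v[r] for ai, v in zip(a, vecs)) for r in range(2)]
assert x_check == [2, 2]  # same point, now over an independent subset
```

Each pass through the loop removes one vector, so the procedure terminates, mirroring the descent in the proof.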


Solution 1:

This is a surprisingly hard problem as you already observed. Let me add another proof, which at its core uses a similar minimality argument as yours.

Let $(x_k)$ in $S$ be given, such that $x_k\to x$.

The set $D_k:=\{ y\ge 0: \ Ay = x_k\}$ is non-empty, closed, and convex. Hence for each $k$ there is $y_k\in D_k$ satisfying $$ \|y_k\|_2 = \inf_{y\in D_k}\|y\|_2. $$ If $(y_k)$ contains a bounded subsequence, then we are done: such a subsequence has a further convergent subsequence $y_{k_n}\to y$, and then $y\ge0$, $Ay=\lim Ay_{k_n}=\lim x_{k_n}=x$, so $x\in S$. It remains to consider the case $\|y_k\|_2\to \infty$.

Denote $v_k:=\frac1{\|y_k\|_2}y_k$. Since $\|v_k\|_2=1$ for all $k$, by compactness of the unit sphere $(v_k)$ contains a convergent subsequence. W.l.o.g. let $(v_k)$ converge to $v$ with $\|v\|_2=1$, $v\ge 0$. Moreover, $Av =\lim Av_k=\lim \frac{x_k}{\|y_k\|_2}=0$, since $(x_k)$ is bounded and $\|y_k\|_2\to\infty$.

The idea is to prove that $y_k-v$ is in $D_k$. Obviously $A(y_k-v)=x_k$. Denote $I:=\{i: v_i>0\}$. Then $y_{k,i}=\|y_k\|_2\cdot v_{k,i}\to\infty$ for $i\in I$. And there is an index $K$ such that $y_{k,i}-v_i \ge0$ for all $i\in I$ and $k>K$. For $i\not\in I$, it holds $y_{k,i}-v_i =y_{k,i}\ge0$. Hence, $y_k-v\in D_k$ for all such $k>K$. Since $0\le y_k -v\le y_k$ and $v\ne 0$, we have $\|y_k-v\|< \|y_k\|_2$ due to the strict convexity of $\|\cdot\|_2$, a contradiction to the minimality of $y_k$. Thus the unbounded case does not happen.