Condition for a tensor to be decomposable

Let $V$ be a vector space of dimension 3 with basis $e_1,e_2,e_3$. Let $W$ be a vector space of dimension 2 with basis $f_1,f_2$. Is $e_1\otimes f_1+e_2\otimes f_2$ decomposable? What about $e_1\otimes f_1+e_1\otimes f_2-e_2\otimes f_1-e_2\otimes f_2$? What is the condition for a tensor $$ t=\sum_{i=1}^3 \sum_{j=1}^2 a_{i,j}\, e_i \otimes f_j $$ to be decomposable, that is, to have the form $t=v\otimes w$ for some $v\in V$ and $w\in W$? As a related question, if I'm given any two vector spaces $V,W$ of dimensions $m,n$ with chosen bases, what is the condition for a tensor in $V\otimes W$ to be decomposable?

I think this has something to do with linear independence, but I'm not very comfortable with the tensor product, so I'm not really sure where to begin.


$\newcommand\P{\mathbb{P}}$Let $V$ and $W$ be complex vector spaces, and let $\P(V)$, $\P(W)$ and $\P(V\otimes W)$ be the projective spaces attached to $V$, $W$ and $V\otimes W$, respectively. If $v\in V$ is non-zero, I'll denote by $[v]$ the point of $\P(V)$ corresponding to it; it is the equivalence class of $v$ in $V\setminus0$ for the equivalence relation of linear dependence.

Since decomposability of a tensor does not change when we multiply it by a non-zero scalar, we can talk about the decomposable elements of $\P(V\otimes W)$. Your question is therefore more or less equivalent to

how can we describe the set of decomposable elements of $\P(V\otimes W)$?

Now, there is a map $f:\P(V)\times\P(W)\to\P(V\otimes W)$ which maps $([v],[w])$ to $[v\otimes w]$. This is a map of projective varieties (in the sense of algebraic geometry) and its image is precisely the set of classes of decomposable tensors. The image is in fact a subvariety of $\P(V\otimes W)$, which means that it is the common zero set of a finite set of homogeneous polynomials. Finding these polynomials is a classical problem solved long ago; see the Segre embedding for more information (most introductions to algebraic geometry will say something about it as well).
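To give the simplest instance: when $\dim V=\dim W=2$, the map is the Segre embedding $\P^1\times\P^1\to\P^3$, $([v_1:v_2],[w_1:w_2])\mapsto[v_1w_1:v_1w_2:v_2w_1:v_2w_2]$, and its image is the quadric surface cut out by the single equation $$z_{1,1}z_{2,2}-z_{1,2}z_{2,1}=0,$$ where $[z_{1,1}:z_{1,2}:z_{2,1}:z_{2,2}]$ are homogeneous coordinates on $\P^3$; this is exactly the vanishing of the determinant of the $2\times2$ coefficient matrix $(z_{i,j})$.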

In the special case where $\dim V=3$ and $\dim W=2$, with bases $\{x_1,x_2,x_3\}$ and $\{y_1,y_2\}$, we want to know when a tensor $$\sum_{\substack{1\leq i\leq 3\\1\leq j\leq2}}f_{i,j}\,x_i\otimes y_j$$ is equal to a product $$\Bigl(\sum_{1\leq i\leq 3}v_ix_i\Bigr)\otimes\Bigl(\sum_{1\leq j\leq 2}w_jy_j\Bigr),$$ that is, when $f_{i,j}=v_iw_j$ for some scalars $v_i$ and $w_j$.

It is easy to see that we must have $$f_{i,j}f_{k,l}=f_{i,l}f_{k,j}$$ for all $i,k\in\{1,2,3\}$ and all $j,l\in\{1,2\}$ for that to happen (if $f_{i,j}=v_iw_j$, then both sides equal $v_iv_kw_jw_l$), and some work will show that these conditions are in fact sufficient. We can express all these conditions by saying that the matrix $$\begin{pmatrix}f_{1,1}&f_{1,2}\\f_{2,1}&f_{2,2}\\f_{3,1}&f_{3,2}\end{pmatrix}$$ has rank at most $1$, that is, all of its $2\times2$ minors vanish. Proving this is «just» linear algebra.
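Applying this criterion to the two tensors in the question: the first, $e_1\otimes f_1+e_2\otimes f_2$, has coefficient matrix $$\begin{pmatrix}1&0\\0&1\\0&0\end{pmatrix},$$ whose top $2\times2$ minor equals $1\neq0$, so it is not decomposable. The second has coefficient matrix $$\begin{pmatrix}1&1\\-1&-1\\0&0\end{pmatrix},$$ all of whose $2\times2$ minors vanish; it is decomposable, and indeed equals $(e_1-e_2)\otimes(f_1+f_2)$.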

The answer in the general case, where the dimensions are arbitrary, is of the same spirit: a tensor $\sum_{i,j}a_{i,j}\,e_i\otimes f_j$ in $V\otimes W$ is decomposable exactly when its $m\times n$ coefficient matrix $(a_{i,j})$ has rank at most $1$, that is, when all of its $2\times2$ minors vanish.
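If you want to test this on examples, here is a minimal numerical sketch (in Python with NumPy, assuming the tensor is stored as its $m\times n$ coefficient matrix; the helper name `decompose_if_rank_one` and the tolerance are just illustrative). It checks the rank via the singular values and, in the rank $\leq 1$ case, recovers a pair of factors:

```python
import numpy as np

def decompose_if_rank_one(A, tol=1e-10):
    """Return (v, w) with A ~ outer(v, w) if the coefficient matrix A = (a_ij)
    has numerical rank <= 1, i.e. the tensor sum_ij a_ij e_i (x) f_j is
    decomposable; otherwise return None."""
    A = np.asarray(A, dtype=float)
    U, s, Vt = np.linalg.svd(A)
    # Rank >= 2 means the second singular value is not (numerically) zero.
    if len(s) > 1 and s[1] > tol * max(s[0], 1.0):
        return None
    v = s[0] * U[:, 0]   # absorb the scale into the first factor
    w = Vt[0, :]
    return v, w

# e1⊗f1 + e2⊗f2 : coefficient matrix has rank 2, so not decomposable.
A1 = np.array([[1, 0], [0, 1], [0, 0]])
print(decompose_if_rank_one(A1))          # None

# (e1 - e2)⊗(f1 + f2) : rank-1 coefficient matrix.
A2 = np.array([[1, 1], [-1, -1], [0, 0]])
v, w = decompose_if_rank_one(A2)
print(np.allclose(np.outer(v, w), A2))    # True
```

Using the SVD rather than exact $2\times2$ minors is just a convenient way to make the rank test robust to floating-point noise; over an exact field you would check the minors directly.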

N.B.: it is interesting to know that the question «which tensors have rank $k$?» when $k\geq2$ and there are more than two tensor factors is much, much harder and very important; I think it is unsolved in general. Someone who knows algebraic geometry might be able to tell us.