Addition and tensor product of vector spaces for beginners: concrete example

This is a response to the OP's comment; it is too long to fit in a comment itself.

Conventions

The two objects denoted by $\oplus$ are sometimes called the "internal" and "external" direct sums. External direct sums make sense for any two vector spaces, and their elements are literally ordered pairs whose components come from the summands. Internal direct sums require that the summands both live in a common ambient vector space, and their elements are literally vectors in that ambient space. When the internal sum makes sense, one can prove that the map $V\oplus_{ext} W \to V\oplus_{int} W$ sending $(v,w)$ to $v+w$ is an isomorphism (and is the "best" kind of isomorphism in any sense you might mean that, e.g. functorial), so the two are for all intents and purposes the same object.
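A tiny numerical sketch of this isomorphism (my own illustration, not part of the question): take $V=\operatorname{span}\{e_1\}$ and $W=\operatorname{span}\{e_2\}$ inside $\Bbb R^3$, so both direct sums make sense. The map $(v,w)\mapsto v+w$ loses no information precisely because the summands intersect trivially.

```python
import numpy as np

# V = span{e1} and W = span{e2} inside the ambient space R^3.
e1 = np.array([1.0, 0.0, 0.0])
e2 = np.array([0.0, 1.0, 0.0])

# External direct sum: an element is literally an ordered pair (v, w).
def external_element(a, b):
    return (a * e1, b * e2)

# The isomorphism sends the pair (v, w) to the vector v + w in the
# internal direct sum V + W.
def to_internal(pair):
    v, w = pair
    return v + w

# Because V and W intersect trivially (here: they occupy disjoint
# coordinates), the pair can be recovered from the sum, so nothing is lost.
x = to_internal(external_element(3.0, -2.0))      # the vector 3*e1 - 2*e2
recovered = (x[0] * e1, x[1] * e2)                # the original pair again
```

The recovery step uses that the two summands here are coordinate axes; in general one would project onto the summands, which is well-defined exactly when the intersection is trivial.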

In vector space decompositions such as yours, it is extremely common to use the symbol "=" to mean "isomorphic (in the best needed way)". This abuse of notation is very well-justified in practice, e.g. I may want to construct the tensor product as a set of matrices instead of writing down an abstract basis as you've done, and it's silly to let this "linguistic" difference get in the way.

But for the purposes of this question, it's clear that you intend $V\otimes W$ to mean $\text{span}_{\Bbb R} \{v\otimes w:v\in V, w\in W\}$, and "=" to mean "literally equal as sets". In this case, we must use the internal direct sum, since the left-hand side is not constructed set-theoretically as a set of ordered pairs (unless we have a very strange construction of the external direct sum).

Since you have constructed the ambient vector space $V_{AB}$, in which both $V_I$ and $V_{I\!I}$ live, this is not a problem. We simply need to find two subspaces of $V_{AB}$ with trivial intersection that together span the space.


Construction

Literally speaking, $$ V_{AB} = \left\{ a\left(\begin{bmatrix}1\\0\\0\end{bmatrix}\otimes\begin{bmatrix}0\\1\\0\end{bmatrix}\right) + b\left(\begin{bmatrix}1\\0\\0\end{bmatrix}\otimes \begin{bmatrix}0\\0\\1\end{bmatrix}\right) : a,b\in \Bbb{R}\right\}$$

Thus, one possible choice for $V_I$ and $V_{I\!I}$ would be $$ V_{I} = \left\{ a\left(\begin{bmatrix}1\\0\\0\end{bmatrix}\otimes\begin{bmatrix}0\\1\\0\end{bmatrix}\right) : a\in \Bbb{R}\right\}$$ $$ V_{I\!I} = \left\{ b\left(\begin{bmatrix}1\\0\\0\end{bmatrix}\otimes\begin{bmatrix}0\\0\\1\end{bmatrix}\right) : b\in \Bbb{R}\right\}.$$
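The decomposition above can be checked numerically. As a concrete model (an assumption on my part, in the spirit of the "matrices instead of abstract basis" remark earlier), realize $v\otimes w$ as the Kronecker product `np.kron(v, w)`, a vector in $\Bbb R^9$. Then $V_{AB}=V_I\oplus V_{I\!I}$ amounts to the two spanning vectors being linearly independent.

```python
import numpy as np

# Concrete model of the tensor product: v ⊗ w := np.kron(v, w) in R^9.
e1 = np.array([1.0, 0.0, 0.0])
e2 = np.array([0.0, 1.0, 0.0])
e3 = np.array([0.0, 0.0, 1.0])

u1 = np.kron(e1, e2)   # e1 ⊗ e2, which spans V_I
u2 = np.kron(e1, e3)   # e1 ⊗ e3, which spans V_II

# V_AB = span{u1, u2}.  The stacked matrix has rank 2 = dim V_I + dim V_II,
# which says exactly that V_I + V_II spans V_AB and V_I ∩ V_II = {0},
# i.e. V_AB = V_I ⊕ V_II (an internal direct sum).
rank = np.linalg.matrix_rank(np.vstack([u1, u2]))
```

The rank criterion is the standard one: for finite-dimensional subspaces, the sum is direct precisely when the dimensions add up.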

The natural bases for these spaces are the obvious ones: just remove the coefficients $a$ and $b$.

This is of course not the only choice*, but to address the other question in your comment, it is not even necessary that $V_I$ has dimension 1. It could just as easily be the zero subspace or the full $V_{AB}$ (leaving $V_{I\!I}$ to be the other one). However, because this is "boring", it is sometimes called the trivial direct sum decomposition. So in that sense, the answer to your question is yes: in your example, all nontrivial decompositions will have both summands of dimension 1.

* I say "of course" in the sense that there is the usual freedom that one has in (direct) sum constructions. For instance, a different choice would be $V_{I\!I}$ as before, but $$ V_{I} = \left\{ 3a\left(\begin{bmatrix}1\\0\\0\end{bmatrix}\otimes\begin{bmatrix}0\\1\\0\end{bmatrix}\right) - 2a\left(\begin{bmatrix}1\\0\\0\end{bmatrix}\otimes\begin{bmatrix}0\\0\\1\end{bmatrix}\right) : a\in \Bbb{R}\right\},$$ and other such things.
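This alternative choice passes the same numerical check as before (again using the Kronecker-product model of $\otimes$, which is my own concrete realization): the vector $3(e_1\otimes e_2)-2(e_1\otimes e_3)$ is still linearly independent from $e_1\otimes e_3$, so the two lines still give a direct sum decomposition of $V_{AB}$.

```python
import numpy as np

# Same concrete model as before: v ⊗ w := np.kron(v, w).
e1 = np.array([1.0, 0.0, 0.0])
e2 = np.array([0.0, 1.0, 0.0])
e3 = np.array([0.0, 0.0, 1.0])
u1 = np.kron(e1, e2)   # e1 ⊗ e2
u2 = np.kron(e1, e3)   # e1 ⊗ e3

# The alternative V_I is spanned by 3*(e1 ⊗ e2) - 2*(e1 ⊗ e3).
w = 3.0 * u1 - 2.0 * u2

# w and u2 are linearly independent, so span{w} ⊕ span{u2} = V_AB
# is another valid (nontrivial) direct sum decomposition.
rank = np.linalg.matrix_rank(np.vstack([w, u2]))
```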