mathematical difference between column vectors and row vectors

I'm writing a mathematical library, and I have an idea: I want to automatically turn column matrices and row matrices into vectors, with all of the mathematical properties of a vector.

Answer I'm looking for:

Someone with good mathematical reasoning explaining why:

column matrices, column vectors, row matrices, and row vectors should not be treated as the same thing. (The library will of course understand operations like [[1,2],[3,4]] * [1,2], where [1,2] is a vector.)

or:

some kind of showcase or example where it is impossible for a library that can't differentiate between row vectors and column vectors to know which one of several possible answers is correct.

or:

some kind of evidence that it is in fact possible to do this.

Please note: inner vector multiplication will be easy to integrate by using a special function for that operation rather than the * sign.
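For context, this is roughly the design NumPy settled on (a sketch of existing behavior, not a claim that it is the only workable design): a 1-D array is neither a row nor a column, and the `@` operator promotes it to whichever shape the surrounding operation requires.

```python
import numpy as np

M = np.array([[1, 2], [3, 4]])
v = np.array([1, 2])   # a 1-D array: neither a row nor a column

# On the right of @, v is treated as a 2x1 column; the result is flattened back to 1-D.
print(M @ v)   # [ 5 11]

# On the left of @, v is treated as a 1x2 row; the result is again flattened to 1-D.
print(v @ M)   # [ 7 10]
```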


One "silly" example is the product of a column matrix times a row matrix. Consider: $$ \left[\begin{array}{c} 1 \\ 2 \\ 3 \end{array}\right] \left[\begin{array}{ccc} 4 & 5 & 6 \end{array}\right].$$ By the rules of matrix multiplication, we obtain the $3 \times 3$ matrix $$ \left[\begin{array}{ccc} 4 & 5 & 6 \\ 8 & 10 & 12 \\ 12 & 15 & 18 \end{array}\right].$$ However, if I had "forgotten" that my original matrices were column and row matrices, respectively, then I might have considered them as vectors and (perhaps) computed the inner product instead: $$ (1, 2, 3) \cdot (4, 5, 6) = 32.$$ By the way, if one works entirely in terms of matrices and considers any vector to be a column matrix, then the inner product can be defined by $\mathbf{v} \cdot \mathbf{w} = \mathbf{v}^T\mathbf{w}$, which is standard practice in most linear algebra texts.
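The ambiguity above is easy to reproduce in code (a NumPy sketch; any array library with explicit shapes would show the same thing): the same two triples of numbers yield either a $3 \times 3$ matrix or a scalar, depending on whether they carry their row/column shape.

```python
import numpy as np

col = np.array([[1], [2], [3]])   # 3x1 column matrix
row = np.array([[4, 5, 6]])       # 1x3 row matrix

# Treated as matrices, column times row is the 3x3 outer product.
outer = col @ row
print(outer)
# [[ 4  5  6]
#  [ 8 10 12]
#  [12 15 18]]

# Stripped of their shapes, the natural product of two triples is the inner product.
inner = np.dot([1, 2, 3], [4, 5, 6])
print(inner)   # 32
```

A library that erases the row/column distinction has no way to decide between these two answers from the operands alone, which is exactly the questioner's second scenario.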

Hope this helps!


Don't expect to find any important mathematical distinction between them: these objects differ only at the level of notation and convention. They will form isomorphic (i.e. structurally equivalent) vector spaces.

There are quite a number of these representations:

  • $n \times 1$ (column) matrices and column vectors of length $n$, e.g. \[\left(\begin{array}{c} 1 \\ 2 \\ 3 \\ \end{array}\right).\] A matrix uses two indices, $A(i,j)$ say (where, in this case, index $j$ can take on only one value), whereas a column vector has only one index (a distinction that can matter, e.g., in computer algebra systems).

  • $1 \times n$ (row) matrices and row vectors of length $n$, e.g. \[\left(\begin{array}{ccc} 1 & 2 & 3 \\ \end{array}\right).\] Same difference as with column matrices and column vectors.

  • 1-dimensional array of length $n$ (or a $k$-dimensional array where $k-1$ indices can take on only one value and one index can take on $n$ values).

  • Sequences of length $n$, ordered lists of length $n$, or ordered $n$-multisets, e.g. $(1,2,3)$.

  • Functions $f:\{1,2,\ldots,n\} \rightarrow S$, e.g. $f(x)=x$ and $n=3$.

  • Coefficients of polynomials of degree at most $n-1$ in a single indeterminate $x$, e.g. $1+2x+3x^2$.

The key ingredient in each case is that there is a 1st element, a 2nd element, and so on up to an $n$-th element. Individual definitions come with their own conventions (such as how matrix multiplication works) and will be easier to use in different contexts.
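The indexing point in the first two bullets can be made concrete (a NumPy sketch; any system with explicit array shapes behaves similarly): a column matrix, a row matrix, and a plain vector built from the same three numbers are three distinct objects, and transposition only does something to the first two.

```python
import numpy as np

col = np.array([[1], [2], [3]])  # 2-D, shape (3, 1): two indices
row = np.array([[1, 2, 3]])      # 2-D, shape (1, 3): two indices
vec = np.array([1, 2, 3])        # 1-D, shape (3,):   one index

print(col.shape, row.shape, vec.shape)   # (3, 1) (1, 3) (3,)

# Transposing swaps the two matrix indices, turning a column into a row...
print(col.T.shape)   # (1, 3)

# ...but a 1-D vector has only one index, so transposition is a no-op on it.
print(vec.T.shape)   # (3,)
```

All three are isomorphic as vector spaces, as stated above; the shapes are bookkeeping that tells each operation which convention to apply.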