Can a basis for a vector space be made up of matrices instead of vectors?
I'm sorry if this is a silly question. I'm new to the notion of bases and all the examples I've dealt with before have involved sets of vectors containing real numbers. This has led me to assume that bases, by definition, are made up of a number of $n$-tuples.
However, now I've been thinking about a basis for all $n\times n$ matrices and I keep coming back to the idea that the simplest basis would be $n^2$ matrices, each with a single $1$ in a unique position.
Is this a valid basis? Or should I be trying to get column vectors on their own somehow?
Elements of a basis of a vector space always have to be elements of the vector space in the first place. Hence, if you are looking for a basis of the space of all $n\times n$ matrices, then matrices actually are your vectors, and they are the only possible choice for basis elements. In fact, the matrices you describe are a valid basis for the space of all $n\times n$ matrices. When you look at matrices this way (as vectors of the vector space of all $n\times n$ matrices), it might help to realize that they are just tuples with $n^2$ entries, arranged in a square.
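For instance, with $n=2$, writing $E_{ij}$ for the matrix with a $1$ in position $(i,j)$ and $0$s elsewhere (this notation is just for the illustration), your basis consists of the four matrices
$$E_{11}=\begin{pmatrix}1&0\\0&0\end{pmatrix},\quad E_{12}=\begin{pmatrix}0&1\\0&0\end{pmatrix},\quad E_{21}=\begin{pmatrix}0&0\\1&0\end{pmatrix},\quad E_{22}=\begin{pmatrix}0&0\\0&1\end{pmatrix},$$
and every $2\times 2$ matrix is a unique linear combination of them:
$$\begin{pmatrix}a&b\\c&d\end{pmatrix}=a\,E_{11}+b\,E_{12}+c\,E_{21}+d\,E_{22}.$$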
Yes, you are right. The space of all $n\times n$ matrices is a vector space of dimension $n^2$. In fact, just to spice things up: each of the following sets of $n\times n$ matrices, namely the
- diagonal,
- symmetric and
- upper (or lower) triangular matrices,
is a subspace of the space of all matrices of that size.
As with all subspaces, you can take any linear combination and stay within the space. (Also, the zero matrix belongs to all three.)
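For example, the closure check for the symmetric case is a one-line computation: if $A^T=A$ and $B^T=B$, then for any scalars $\lambda,\mu$
$$(\lambda A+\mu B)^T=\lambda A^T+\mu B^T=\lambda A+\mu B,$$
so every linear combination of symmetric matrices is again symmetric.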
Try to find a basis for each of the three special cases above. For the diagonal matrices, a basis is the set of $n$ matrices whose $i^{\text{th}}$ member has a $1$ in position $(i,i)$ and $0$s everywhere else. Try to figure out basis matrices for the symmetric and triangular cases yourself; a small worked case is sketched below.
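To make this concrete (just a sketch for $n=2$, using the same single-entry matrices as above), the diagonal case gives the two basis matrices
$$\begin{pmatrix}1&0\\0&0\end{pmatrix},\qquad \begin{pmatrix}0&0\\0&1\end{pmatrix},$$
so the diagonal $2\times 2$ matrices form a subspace of dimension $2$. For the symmetric $2\times 2$ matrices, one possible choice of basis is
$$\begin{pmatrix}1&0\\0&0\end{pmatrix},\qquad \begin{pmatrix}0&0\\0&1\end{pmatrix},\qquad \begin{pmatrix}0&1\\1&0\end{pmatrix},$$
which gives dimension $3$ (in general, $n(n+1)/2$ for symmetric $n\times n$ matrices).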