Does a "cubic" matrix exist?

Well, I've heard that a "cubic" matrix might exist, and I thought: would it be like a magic cube? And more: does it even have a determinant, and other properties? I'm a young student, so... please don't get mad at me if I'm saying something stupid.

Thank you.

P.S. I'm 14 years old. I don't know that much about mathematics, but I swear I'll try to understand your answers. I just know the basics of precalculus.


If we're working with three-dimensional vectors, a matrix is a $3\times 3$ array of 9 numbers. If I'm understanding your question right, you're asking whether there is something like a $3\times 3\times 3$ array of 27 numbers with interesting properties.

Yes, there is such a thing; it is called a tensor. Tensors are a generalization of both vectors and matrices:

  • A number is a "rank-0 tensor".
  • A vector is a "rank-1 tensor"; it contains $D$ numbers when we're working in $D$ dimensions.
  • A matrix is a "rank-2 tensor", containing $D\times D$ numbers.
  • Your "cubic" thing is a "rank-3 tensor", containing $D\times D\times D$ numbers.

... and so forth.
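
If it helps to see the counting concretely, here is a small sketch in Python/NumPy (just an illustration of the list above; the array shapes are the whole point):

```python
# Rank = number of indices; each extra index multiplies the storage by D.
import numpy as np

D = 3
scalar = np.float64(5.0)        # rank-0 tensor: a single number
vector = np.zeros(D)            # rank-1 tensor: D numbers
matrix = np.zeros((D, D))       # rank-2 tensor: D*D = 9 numbers
cube   = np.zeros((D, D, D))    # rank-3 tensor: D*D*D = 27 numbers

print(vector.size, matrix.size, cube.size)  # 3 9 27
```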

One use for a rank-3 tensor is to express a function that takes two vectors and produces a third vector, with the property that if you keep either argument constant, the output is a linear function of the other input (that is, a bilinear mapping from two vectors to one vector). One familiar example of such a function is the cross product. To completely specify such a thing you need 27 numbers, namely the 3 coordinates of each of $f(e_1,e_1)$, $f(e_1,e_2)$, $f(e_1,e_3)$, $f(e_2,e_1)$, etc. Using linearity in each argument, this is enough to determine the output for any two input vectors.
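
To make the cross-product example concrete, here is a sketch in Python/NumPy (my own illustration; the rank-3 tensor used is the Levi-Civita symbol $\varepsilon_{ijk}$, which is one standard way to encode the cross product):

```python
# The cross product as a rank-3 tensor: (u x v)_i = sum_{j,k} eps[i,j,k] u[j] v[k].
import numpy as np

eps = np.zeros((3, 3, 3))
eps[0, 1, 2] = eps[1, 2, 0] = eps[2, 0, 1] = 1.0    # even permutations of (0,1,2)
eps[0, 2, 1] = eps[2, 1, 0] = eps[1, 0, 2] = -1.0   # odd permutations of (0,1,2)

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0, 6.0])

via_tensor = np.einsum('ijk,j,k->i', eps, u, v)  # contract the tensor with both inputs
print(via_tensor)      # [-3.  6. -3.]
print(np.cross(u, v))  # same result
```

The 27 entries of `eps` are exactly the coordinates of the $f(e_i,e_j)$ listed above; for the cross product most of them happen to be zero.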

I haven't heard of any generalization of determinants to higher-rank tensors, but I cannot offhand think of a principled reason why one couldn't exist.

The study of tensors belongs in the field of multilinear algebra. It's quite possible to get at least an undergraduate degree in mathematics without ever hearing about them. If you take physics, you'll see lots and lots of them, though.


In addition to the canonical answer involving tensors and multilinear algebra, there is also an approach in which the notion of the determinant as a solvability condition for a system of equations is generalized to some higher-dimensional situations. The basic reference for this program (or one form of it) is the book by Gelfand, Kapranov and Zelevinsky, of which the introduction and early chapters are relatively accessible:

http://books.google.com/books?id=2zgxQVU1hFAC


Matrices are like tables, with elements $A_{mn}$ and with operations of addition and multiplication: $(A+B)_{mn} = A_{mn}+B_{mn}$ and $(A \cdot B)_{mn} = \sum_k A_{mk} B_{kn}$.

Cubic matrices have three indices $A_{mnk}$, with entrywise addition $(A+B)_{mnk} = A_{mnk}+B_{mnk}$ and a triple product $(A \cdot B \cdot C)_{m n k} = \sum_{\ell} A_{m n \ell} B_{m \ell k} C_{\ell n k}$.
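
For a concrete reading of that triple sum, here is a sketch in Python/NumPy (just one way to evaluate the formula above; the `einsum` index string mirrors $m, n, k, \ell$):

```python
# Entrywise addition and the ternary product of three cubic matrices,
# (A*B*C)_{mnk} = sum_l A_{mnl} B_{mlk} C_{lnk}, evaluated with einsum.
import numpy as np

D = 3
rng = np.random.default_rng(0)
A, B, C = (rng.standard_normal((D, D, D)) for _ in range(3))

S = A + B                                   # addition is entrywise
T = np.einsum('mnl,mlk,lnk->mnk', A, B, C)  # the triple product
print(S.shape, T.shape)                     # (3, 3, 3) (3, 3, 3)
```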

See arXiv:hep-th/0207054v3 for a flavor of applications.


The answer is yes. There are many places in mathematics where it is useful to 'store' numbers (or whatever) in a 3-dimensional grid. That is not the problem; the real work is putting them in a context and defining the right operations so that you can combine things and do some abstract algebra.

For a specific example, start with something concrete. Consider linear transformations on the plane, i.e. $\mathbb{R}^2$, using the basis vectors $\imath = [1,0]$ and $\jmath = [0,1]$. A linear transformation from the plane to the plane can be represented by a $2 \times 2$ matrix. Once this is solidly understood, consider a function of two vector variables (again, to the plane), say $L(v_1,v_2) = w$, where $L$ is linear in both variables. This means that if you fix either $v_1$ or $v_2$, what remains is a linear transformation of the other variable (much like taking a partial derivative of a function of several variables, one variable at a time). One example might look like $L([a_1,b_1],[a_2,b_2]) = (3a_1a_2 - 5a_1b_2)[2,1] + b_1b_2[1,5]$.

Now you have some coefficients involved:

$L(\imath,\imath) = a\imath + b\jmath$

$L(\imath,\jmath) = c\imath + d\jmath$

$L(\jmath,\imath) = e\imath + f\jmath$

$L(\jmath,\jmath) = g\imath + h\jmath$.

Notice you have eight numbers $a$ through $h$ here which completely describe $L$. Also, note you could arrange and label these coefficients more sensibly (how, and what are these numbers given this example?). Essentially the space of inputs is four-dimensional, but you don't think of them as four in a row or column, but as four arranged in a square. And then there are two choices for the coefficients on the output, the one for $\imath$ and the one for $\jmath$.

Now these eight numbers naturally fit in a $2 \times 2 \times 2$ cube, and they are essentially the "matrix" of $L$, which is called a tensor.
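
If you want to see those eight numbers sitting in their cube, here is a sketch in Python/NumPy (the entries are read off from the example $L$ above; `T[i, j, k]` is the $k$-th output coordinate of $L$ applied to the $i$-th and $j$-th basis vectors):

```python
# The 2 x 2 x 2 "matrix" (tensor) of the bilinear map
#   L([a1, b1], [a2, b2]) = (3*a1*a2 - 5*a1*b2) * [2, 1] + b1*b2 * [1, 5]
import numpy as np

T = np.zeros((2, 2, 2))
T[0, 0] = [6, 3]      # L(i, i) =  3 * [2, 1]
T[0, 1] = [-10, -5]   # L(i, j) = -5 * [2, 1]
T[1, 0] = [0, 0]      # L(j, i) =  0
T[1, 1] = [1, 5]      # L(j, j) =  1 * [1, 5]

def L(v1, v2):
    """Evaluate the bilinear map by contracting the cube with both inputs."""
    return np.einsum('ijk,i,j->k', T, v1, v2)

def L_formula(v1, v2):
    """The same map, written straight from the defining formula."""
    a1, b1 = v1
    a2, b2 = v2
    return (3*a1*a2 - 5*a1*b2) * np.array([2, 1]) + b1*b2 * np.array([1, 5])

v1, v2 = np.array([2.0, 3.0]), np.array([1.0, 4.0])
print(L(v1, v2), L_formula(v1, v2))   # both print [-56.  26.]
```

The contraction `'ijk,i,j->k'` is exactly "plug the two inputs into the two input slots of the cube", the same bookkeeping an ordinary matrix does for a single input.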