What, Exactly, Is a Tensor?

At the lowest level of understanding, a tensor $T$ of rank $r$ is an $r$-dimensional array (think of a spreadsheet, generalized to $r$ index directions) whose "side lengths" are all equal to a given $n\geq1$. Therefore $T$ has $n^r$ entries, which we assume to be real numbers in what follows.
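
To make the counting concrete, here is a minimal NumPy sketch (my own illustration; the shape and entry values are arbitrary):

```python
import numpy as np

# A rank-3 tensor over R^2: a 2 x 2 x 2 array with 2**3 = 8 entries.
n, r = 2, 3
T = np.arange(n**r, dtype=float).reshape((n,) * r)

print(T.ndim)   # rank r = 3
print(T.shape)  # (2, 2, 2): every "side length" equals n
print(T.size)   # n**r = 8 entries
```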

When we are setting up such a tensor, we have some application in mind, say in geometry or physics. That's where the difficulties come in. The tensor is meant to be "applied" to one or several (variable) vectors, and the result will be a number or a vector of interest in the context at hand. E.g., the value $T(x,y)$ could be the scalar product of $x$ and $y$, or the area of the parallelogram spanned by $x$ and $y$, or the image of $x$ under $T$ when $T$ is considered as a linear map, or the reaction force felt when moving in direction $x$, and so on. For the computation of actual values we need the coordinates of $x$ and $y$. Now these depend on the choice of basis in the ground space ${\mathbb R}^n$, and when we change the basis the coordinate values of a point $x$ change. But the scalar product or some induced force, being a "well defined" geometric or physical quantity, should not change. This in turn implies that the entries in our tensor (spreadsheet) $T$ will have to change, albeit in a characteristic way, called "contravariant" or "covariant", depending on the case at hand.
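
To see this invariance concretely, here is a NumPy sketch (my own illustration, not part of the text above): I take the standard scalar product as the example 2-tensor, pick an arbitrary invertible basis-change matrix $P$, and check that the tensor's entries change while the value $T(x,y)$ does not.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3
G = np.eye(n)                       # entries of the standard scalar product, our example tensor
x, y = rng.standard_normal(n), rng.standard_normal(n)

P = rng.standard_normal((n, n))     # columns: a new (random, almost surely invertible) basis
x_new = np.linalg.solve(P, x)       # coordinates transform "contravariantly": x' = P^{-1} x
y_new = np.linalg.solve(P, y)
G_new = P.T @ G @ P                 # the tensor's entries transform "covariantly"

# The well-defined quantity -- the scalar product itself -- is unchanged:
print(np.isclose(x @ G @ y, x_new @ G_new @ y_new))  # True
```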

(As an aside: when a certain Excel spreadsheet is meant to be a price list for various fabrics, its entries will likewise change in a characteristic way when the currency or the units of measurement are changed.)

But we have a definite feeling that there is some hidden "robust identity" incorporated in $T$ that is independent of the more or less accidental values appearing in the spreadsheet. It is only in the second half of the last century that mathematics found a universal (and abstract!) way to express and to deal with this "hidden identity" of $T$. The field of mathematics concerned with this is called multilinear algebra. Only in this realm does it then make sense to talk about the tensor product. But I won't go into this here.


A $k$-tensor is a multilinear function from $V\times V\times\dots\times V$ to the reals, where $V$ is a vector space and $k$ is the number of the $V$'s in the above Cartesian product. (Calculus on Manifolds, Michael Spivak, 1965, page 75).

This is the best definition I can find. I am with you, and thank you for asking the question, because I hate definitions that lack a noun. So and so is a __ (noun), please!

I don't know if a $k$-tensor is the most general type of tensor or not. (Strictly speaking it is not: Spivak's $k$-tensors are the covariant ones; the most general tensors are multilinear maps whose arguments may also come from the dual space $V^*$.)
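
Either way, Spivak's definition is easy to test numerically. A small NumPy sketch (my own example): the dot product on ${\mathbb R}^3$ is a $2$-tensor, i.e. linear in each argument separately.

```python
import numpy as np

# The dot product on R^3 as a 2-tensor in Spivak's sense:
# a function of two vectors, linear in each slot.
def T(x, y):
    return float(np.dot(x, y))

x, y, z = np.array([1., 2., 3.]), np.array([4., 5., 6.]), np.array([7., 8., 9.])
a, b = 2.0, -3.0

# Multilinearity in the first slot: T(a*x + b*z, y) == a*T(x, y) + b*T(z, y)
print(np.isclose(T(a * x + b * z, y), a * T(x, y) + b * T(z, y)))  # True
```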


The simplest case is a tensor product of two vector spaces. If $V$ is a vector space with basis $\{v_i\}$ and $W$ is a vector space with basis $\{w_j\}$, then $V \otimes W$ is a vector space with basis $\{v_i \otimes w_j\}$, one basis element for each pair $(i,j)$.

There is more theory behind it than that. I'm sure you've read stuff about it being universal with respect to bilinear maps and such. But in terms of "building spaces out of other spaces" it's not that complicated. A tensor product of an $n$-dimensional vector space with an $m$-dimensional vector space is just an $nm$-dimensional vector space.
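
In coordinates this identification is just the Kronecker product. A quick NumPy sketch (dimensions and vectors are arbitrary choices of mine):

```python
import numpy as np

# Identifying V ⊗ W with R^(n*m) via the Kronecker product of coordinate vectors.
n, m = 2, 3
v = np.array([1., 2.])        # a vector in V, here R^2
w = np.array([3., 4., 5.])    # a vector in W, here R^3

vw = np.kron(v, w)            # coordinates of v ⊗ w in the basis {v_i ⊗ w_j}
print(vw.shape)               # (6,): dim(V ⊗ W) = n*m = 6
```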


Every tensor is associated with a multilinear map that produces a scalar.

For instance, a vector can be identified with a map that takes in another vector (in the presence of an inner product) and produces a scalar. If I have a vector $v$ and some input vector $a$, then I define the map $\underline v(a) \equiv v \cdot a$.

A matrix is just a representation of a map that takes in two vectors. Usually we say matrices take in vectors and produce vectors, $T: a \mapsto a'$ for instance. But you can instead use the inner product and say there is a map $\underline T(a, b)$ which produces a scalar by $\underline T(a, b) = T(a) \cdot b$.
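
Here is a minimal NumPy sketch of that re-reading (the matrix and vectors are random placeholders):

```python
import numpy as np

rng = np.random.default_rng(1)
T = rng.standard_normal((3, 3))        # a matrix: a map taking one vector to another
a, b = rng.standard_normal(3), rng.standard_normal(3)

def T_bar(a, b):
    """The same matrix viewed as a 2-tensor: two vectors in, one scalar out."""
    return float((T @ a) @ b)          # underline-T(a, b) = T(a) · b

print(T_bar(a, b))
```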

Tensors obey certain transformation laws. A change of basis for a matrix is a similarity transformation; for tensors the rule is just a generalization of this idea. This gives a way to compute the components of a tensor in a new basis, but the underlying map can be thought of as unchanging, the same way a vector expressed in a new basis is geometrically considered the same as it was before.
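
A sketch of this, assuming the common convention that a change of basis with matrix $P$ sends a vector's coordinates to $a' = P^{-1}a$ and a matrix to $T' = P^{-1}TP$:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 3
T = rng.standard_normal((n, n))       # a linear map, written in the old basis
P = rng.standard_normal((n, n))       # change-of-basis matrix (almost surely invertible)
a = rng.standard_normal(n)

T_new = np.linalg.solve(P, T @ P)     # similarity transformation: T' = P^{-1} T P
a_new = np.linalg.solve(P, a)         # the vector's new coordinates: a' = P^{-1} a

# The underlying map is unchanged: T'(a') is just T(a) expressed in the new basis.
print(np.allclose(T_new @ a_new, np.linalg.solve(P, T @ a)))  # True
```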

Some tensors correspond to geometric objects or primitives. As I said, vectors can be thought of as very simple tensors. Some other tensors correspond to planes, volumes, and so on, formed directly from 2, 3, or more vectors. Clifford algebra, which is built as a quotient of the tensor algebra, deals directly with such geometrically significant objects. Not all tensors are so easy to visualize or imagine, though; the rest can only be thought of abstractly as maps.
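
As a taste of the geometric side, a NumPy sketch (my own example) of the simplest such object: the antisymmetric $2$-tensor built from two vectors, whose entries encode the plane they span and whose size recovers the parallelogram area mentioned at the top.

```python
import numpy as np

# An antisymmetric 2-tensor ("bivector") built from two vectors in R^3.
x = np.array([1., 0., 0.])
y = np.array([1., 2., 0.])

B = np.outer(x, y) - np.outer(y, x)   # the wedge x ∧ y as an antisymmetric matrix

# For vectors in R^3, the area of the parallelogram spanned by x and y is
# sqrt(sum(B_ij^2) / 2), which agrees with the cross-product norm |x × y|.
area = np.sqrt((B ** 2).sum() / 2)
print(np.isclose(area, np.linalg.norm(np.cross(x, y))))  # True: area = 2
```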