Are matrices rank 2 tensors?

This question doesn't have a single good answer, because there isn't a universally agreed-upon definition of "tensor" in mathematics. In particular:

  1. Tensors are sometimes defined as multidimensional arrays, in the same way that a matrix is a two-dimensional array. From this point of view, a matrix is certainly a special case of a tensor.

  2. In differential geometry and physics, "tensor" refers to a certain kind of object that can be described at a point on a manifold (though the word "tensor" is often used to refer to a tensor field, in which one tensor is chosen for every point). From this point of view, a matrix can be used to describe a rank-two tensor in local coordinates, but a rank-two tensor is not itself a matrix.

  3. In linear algebra, "tensor" sometimes refers to an element of a tensor product, and sometimes refers to a certain kind of multilinear map. Again, neither of these is a generalization of "matrix", though you can get a matrix from a rank-two tensor if you choose a basis for your vector space, as in the worked example below.
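
To make the third point concrete, here is a minimal worked example, taking (as one possible choice) a bilinear map $B \colon V \times V \to \mathbb{F}$ as the rank-two tensor. Fixing a basis $e_1, \dots, e_n$ of $V$ produces a matrix of components

$$A_{ij} = B(e_i, e_j), \qquad B(v, w) = \sum_{i,j} v^i A_{ij}\, w^j = v^{\mathsf{T}} A\, w,$$

so the matrix $A$ records what $B$ does, but only relative to the chosen basis; a different basis gives a different matrix for the same tensor.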

You run into the same problem if you ask a question like "Is a vector just a tuple of numbers?" Sometimes a vector is defined as a tuple of numbers, in which case the answer is yes. However, in differential geometry and physics, the word "vector" refers to an element of the tangent space to a manifold, while in linear algebra, a "vector" may be any element of a vector space.

On a basic level, the statement "a vector is a rank 1 tensor, and a matrix is a rank 2 tensor" is roughly correct. This is certainly the simplest way of thinking about tensors, and is reflected in the Einstein notation. However, it is important to appreciate the subtleties of this identification, and to realize that "tensor" often means something slightly different and more abstract than a multidimensional array.
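
For instance, in Einstein notation (a standard illustration, with the index placement chosen here for the linear-map case) a vector carries one index and a matrix carries two:

$$w^i = T^i{}_{j}\, v^j,$$

where the repeated index $j$ is implicitly summed over; the rank of the tensor shows up as the number of indices.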


The connection is this: a matrix consists of the coefficients of a (1,1) tensor, but it is not a tensor itself.

Suppose we are talking about a linear transformation $T$ on an $n$-dimensional vector space $V$.

Now $T$ is certainly a tensor (tensors are, after all, multilinear maps on copies of $V$ and $V^\ast$, and a linear transformation can be interpreted as a multilinear function from $V\times V^\ast$ to $\mathbb{F}$).
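
Concretely, one standard way to write this identification (the ordering $V \times V^\ast$ follows the sentence above) is

$$\tilde{T}(v, \varphi) = \varphi\big(T(v)\big), \qquad v \in V,\ \varphi \in V^\ast,$$

which is linear in each argument separately, so $\tilde{T}$ is a bilinear map $V \times V^\ast \to \mathbb{F}$.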

Once a basis for $V$ is fixed, you can talk about the matrix $A$ for $T$, written in terms of that basis. The same can be said for a general multilinear function on copies of $V$ and $V^\ast$: once you have fixed a basis, you have a big array holding its coefficients.
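
As a sketch of what those coefficients are (introducing, as extra notation, the dual basis $\varepsilon^1, \dots, \varepsilon^n$ of $V^\ast$): the entries of $A$ can be written as

$$A^i{}_{j} = \varepsilon^i\big(T(e_j)\big), \qquad \text{equivalently} \qquad T(e_j) = \sum_i A^i{}_{j}\, e_i,$$

and for a multilinear function $S$ taking, say, $k$ vector arguments, the analogous array of coefficients is $S_{i_1 \cdots i_k} = S(e_{i_1}, \dots, e_{i_k})$.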

It's important not to confuse the array with the tensor. The tensor is a basis-independent entity: it's a kind of function. The components are just one particular representation of that function, and they depend upon the choice of basis.
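
One way to see the basis dependence explicitly (using a change-of-basis matrix $P$, introduced here just for illustration): if a new basis is given by $e'_j = \sum_i P^i{}_j\, e_i$ with $P$ invertible, then the components of the same linear transformation $T$ become

$$A' = P^{-1} A P,$$

so the array of numbers changes while the underlying function $T$ does not.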