What does it mean for two matrices to be orthogonal?

There are two possibilities here:

  1. There's the concept of an orthogonal matrix. Note that this is a property of a single matrix, not a relation between two matrices. An orthogonal matrix is a real matrix that describes a transformation leaving scalar products of vectors unchanged. The term "orthogonal matrix" probably comes from the fact that such a transformation preserves orthogonality of vectors (though note that this property alone does not characterize the orthogonal transformations; you additionally need that lengths are unchanged, i.e. an orthonormal basis is mapped to another orthonormal basis). Another reason for the name might be that the columns of an orthogonal matrix form an orthonormal basis of the vector space, and so do the rows; this fact is encoded in the defining relation $A^TA = AA^T = I$, where $A^T$ is the transpose of $A$ (rows and columns exchanged) and $I$ is the identity matrix.

    Usually if one speaks about orthogonal matrices, this is what is meant.

  2. One can indeed consider matrices as vectors; an $n\times n$ matrix is then just a vector in an $n^2$-dimensional vector space. In such a vector space, one can then define a scalar product just as in any other vector space. It turns out that for real matrices, the standard scalar product can be expressed in the simple form $$\langle A,B\rangle = \operatorname{tr}(AB^T)$$ and thus you can also define two matrices as orthogonal to each other when $\langle A,B\rangle = 0$, just as with any other vector space.

    To imagine this, you simply forget that the matrices are matrices, and just consider all matrix entries as components of a vector. The two vectors then are orthogonal in the usual sense.
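Both senses can be checked numerically. The sketch below (plain Python lists as matrices; the helper names `transpose`, `matmul`, `trace`, and `frobenius_inner` are illustrative, not standard) verifies that a rotation matrix satisfies $Q^TQ = I$, and that two matrices can be orthogonal *to each other* under $\langle A,B\rangle = \operatorname{tr}(AB^T)$:

```python
import math

def transpose(M):
    return [list(row) for row in zip(*M)]

def matmul(A, B):
    Bt = transpose(B)
    return [[sum(a * b for a, b in zip(row, col)) for col in Bt] for row in A]

def trace(M):
    return sum(M[i][i] for i in range(len(M)))

def frobenius_inner(A, B):
    # <A, B> = tr(A B^T) = sum of entrywise products
    return trace(matmul(A, transpose(B)))

# Sense 1: a rotation matrix Q satisfies Q^T Q = I.
t = 0.3
Q = [[math.cos(t), -math.sin(t)],
     [math.sin(t),  math.cos(t)]]
QtQ = matmul(transpose(Q), Q)
assert all(abs(QtQ[i][j] - (1 if i == j else 0)) < 1e-12
           for i in range(2) for j in range(2))

# Sense 2: two matrices orthogonal to each other, <A, B> = tr(A B^T) = 0.
A = [[1, 0], [0, 1]]
B = [[0, 1], [-1, 0]]
assert frobenius_inner(A, B) == 0   # orthogonal as 4-component vectors
assert frobenius_inner(A, A) == 2   # <A, A> = squared Frobenius norm
```

Note that in sense 2 the identity matrix $A$ and the $90°$ rotation $B$ are orthogonal to each other as vectors, even though $B$ is also an orthogonal matrix in sense 1 while $A$ trivially is too; the two notions are simply unrelated.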


It is not common to say that two matrices are orthogonal to each other, but rather one speaks of a matrix being an orthogonal matrix.

Formally, a matrix $A$ is called orthogonal if $A^TA = AA^T = I$. In other words, the columns of the matrix form a collection of orthogonal and normalized vectors: any two distinct columns are orthogonal as vectors, and each column has norm $1$. (You could also consider rows.)

"Physically," an orthogonal matrix corresponds to a distance-preserving linear transformation (such as a rotation) of the space.
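A quick numerical sanity check of that physical picture (a sketch with illustrative helper names, not part of the answer itself): a rotation matrix preserves dot products, and hence distances.

```python
import math

def apply(M, v):
    # matrix-vector product M v
    return [sum(m * x for m, x in zip(row, v)) for row in M]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

t = 0.7
Q = [[math.cos(t), -math.sin(t)],
     [math.sin(t),  math.cos(t)]]

u, v = [3.0, 4.0], [-1.0, 2.0]
# Dot products are preserved: Qu . Qv = u . v ...
assert abs(dot(apply(Q, u), apply(Q, v)) - dot(u, v)) < 1e-12
# ... and therefore so are distances between points.
assert abs(math.dist(apply(Q, u), apply(Q, v)) - math.dist(u, v)) < 1e-12
```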


$\newcommand{\Reals}{\mathbf{R}}$Let $m$ and $n$ be positive integers. The set $\Reals^{m \times n}$ of all real $m \times n$ matrices is "essentially" the space $\Reals^{mn}$ of all real vectors with $mn$ components: Just put the entries of a matrix into a single column in some specified order. The ordinary dot product in $\Reals^{mn}$ allows us to speak of Euclidean geometry in the space of $m \times n$ matrices. In particular, we may speak of two matrices being "orthogonal" if their dot product is zero.

It turns out that the dot product of two $m \times n$ real matrices $A$ and $B$ has a simple formula, $\operatorname{tr}(A^{T}B)$. (In words, multiply the transpose of $A$ by $B$, then add up the diagonal entries of the resulting square matrix.)
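That formula is easy to verify numerically. The sketch below (plain Python, with illustrative helper names) checks that $\operatorname{tr}(A^TB)$ agrees with the ordinary dot product of the two matrices flattened into vectors of $mn$ components:

```python
def transpose(M):
    return [list(row) for row in zip(*M)]

def matmul(A, B):
    Bt = transpose(B)
    return [[sum(a * b for a, b in zip(row, col)) for col in Bt] for row in A]

def trace(M):
    return sum(M[i][i] for i in range(len(M)))

A = [[1, 2, 3],
     [4, 5, 6]]          # a 2x3 matrix
B = [[7, 8, 9],
     [10, 11, 12]]

# Dot product of the flattened matrices: sum of entrywise products.
flat_dot = sum(a * b for ra, rb in zip(A, B) for a, b in zip(ra, rb))
assert trace(matmul(transpose(A), B)) == flat_dot   # both equal 217
```

The identity holds because $(A^TB)_{ii} = \sum_k A_{ki}B_{ki}$, so summing the diagonal entries just adds up all the entrywise products.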

However, as quid says, there's a completely different definition of "orthogonal" as a property of a single $n \times n$ matrix: "If $u$ and $v$ are vectors in $\Reals^{n}$, then $Au \cdot Av = u \cdot v$." (Merely preserving orthogonality, i.e. $u \cdot v = 0$ if and only if $Au \cdot Av = 0$, is not enough: any nonzero scalar multiple of an orthogonal matrix also has that property.) This turns out to be equivalent to "Multiplication by $A$ preserves all concepts of Euclidean geometry", or "The columns of $A$ form an orthonormal basis of $\Reals^{n}$", among others.

If someone comes to you and says, "Let $A$ and $B$ be two orthogonal matrices, ...", quid's interpretation is probably what they mean. But if lives depend on it, clarify with the asker. :)