Maximum number of linearly independent anticommuting matrices

I can answer the question, under the additional hypothesis that at least two of the matrices are invertible. I haven't figured out how to do it without that hypothesis.

Note that if you had $n^2$ such matrices, you'd have a basis for the space of all $n\times n$ matrices. If you can prove that all the matrices have trace zero, then you get a contradiction, since any linear combination of trace zero matrices has trace zero, but not every matrix has trace zero.
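To spell the contradiction out (over a field of characteristic zero, say, so that $\text{tr}\,I = n \neq 0$): if the matrices $A_i$ formed a basis and each had trace zero, then every matrix $M = \sum_i c_i A_i$ would satisfy

$$ \text{tr}\,M = \sum_i c_i \,\text{tr}\,A_i = 0, $$

which fails for $M = I$.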

So suppose $A,B$ are anticommuting, that is, $AB=-BA$, and suppose $A$ is invertible. Then $B=A^{-1}(-B)A$, which says that $B$ is similar to $-B$. But similar matrices have the same trace, and the trace of $-B$ is the additive inverse of the trace of $B$, so $B$ has trace zero. So the presence of one invertible matrix in the set of anticommuting matrices forces all the other matrices in the set to have trace zero. Thus if there are two (or more) invertible matrices in the set, then all of the matrices in the set have trace zero, so they can't be a basis, so there can't be $n^2$ of them.
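For a concrete illustration of this argument (not needed for the proof), take the pair

$$ A = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}, \qquad B = \begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix}. $$

Then $AB = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix} = -BA$, both matrices are invertible, $A^{-1}=A$, and indeed $A^{-1}(-B)A = -ABA = B$ with $\text{tr}\,B = 0$.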

I wonder whether it's true that $AB=-BA$ implies $A$ and $B$ have trace zero, even if they are not invertible. If you can prove that, you win.


Suppose there were $n^2$ pairwise anticommuting matrices, i.e., as Gerry points out, a basis for the space of $n\times n$ matrices.

If the field were of characteristic two, then anticommuting would be the same as commuting. Aside: A previous Question addresses how many linearly independent commuting matrices one can have, with a sharp count known for most fields. In any case, we cannot have a basis of pairwise commuting matrices unless all matrix multiplication commutes (made explicit below), so in particular in characteristic two we have at most $n^2 - 1$ pairwise (anti)commuting linearly independent matrices unless $n=1$.
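To make that "evident" step explicit: if a basis $\{A_i\}$ consisted of pairwise commuting matrices, then any two matrices $X = \sum_i x_i A_i$ and $Y = \sum_j y_j A_j$ would satisfy

$$ XY = \sum_{i,j} x_i y_j A_i A_j = \sum_{i,j} x_i y_j A_j A_i = YX, $$

which already fails for $n \geq 2$: with the matrix units $E_{ij}$, $E_{11}E_{12} = E_{12}$ while $E_{12}E_{11} = 0$.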

For the rest of this post we consider only fields not of characteristic two.

Express the identity matrix as a linear combination of these, say $I= \sum r_i A_i$ where, without loss of generality, $r_1\neq 0$.

Clearly $A_1$ commutes with $I - r_1 A_1$ but anticommutes with the equal expression $\sum_{i\neq 1} r_i A_i$. Since the characteristic is not two, the product must therefore be zero, implying:

$$ A_1 = r_1 A_1^2 $$

and it follows that $r_1 A_1$ is idempotent, as verified below. The same argument shows that any $r_i A_i$ with $r_i \neq 0$ is idempotent.
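Spelled out: $A_1(I - r_1 A_1)$ equals both $(I - r_1 A_1)A_1$ and $-(I - r_1 A_1)A_1$, so it vanishes (characteristic $\neq 2$), giving $A_1 = r_1 A_1^2$; idempotency follows from

$$ (r_1 A_1)^2 = r_1 (r_1 A_1^2) = r_1 A_1. $$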

Indeed each of the $n^2$ matrices $A_i$ must appear in the representation: if some $A_j$ did not, it would anticommute with all those that do appear, hence with the identity matrix itself, forcing it to be zero (impossible). Thus we have $n^2$ linearly independent pairwise anticommuting idempotent matrices $B_i = r_i A_i$ whose sum is $I$.
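Explicitly, if some $A_j$ did not appear, anticommuting with every term of $I = \sum_i r_i A_i$ would give

$$ A_j = A_j I = -I A_j = -A_j \implies 2A_j = 0 \implies A_j = 0, $$

impossible for a member of a linearly independent set.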

Now we can derive a contradiction with the fact that $I$ has rank $n$. In characteristic zero, Gerry's idea of using the trace works quickly: the trace of an idempotent is its rank, so the trace of the identity matrix would be at least $n^2$. Contradiction (except in the vacuous case $n=1$).
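In symbols: each $B_i$ is a nonzero idempotent, so $\text{tr}\,B_i = \text{rank}\,B_i \geq 1$ and

$$ n = \text{tr}\,I = \sum_{i=1}^{n^2} \text{tr}\,B_i = \sum_{i=1}^{n^2} \text{rank}\,B_i \geq n^2. $$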

For a field of characteristic $p > 2$ we dig a little deeper. For $i \neq j$, the matrices $B_i, B_j$ not only anticommute, they actually annihilate one another: $B_i B_j = 0$.

For any column $v$ of the idempotent $B_j$ we have $B_j v = v$. For $i \neq j$, then, $B_j(B_i v) = -B_i B_j v = -B_i v$, so $B_i v$ would be an eigenvector of $B_j$ with eigenvalue $-1$. But an idempotent has no eigenvalue $-1$ when $p > 2$ (its eigenvalues satisfy $\lambda^2 = \lambda$, so $\lambda \in \{0,1\}$), hence $B_i v = 0$. This holds for each column $v$ of $B_j$, so $B_i B_j = 0$.

It follows that the column spaces of $B_i$ and $B_j$ have trivial intersection, and so their (nonzero) ranks are additive: $B_i + B_j$ is again idempotent and $\text{rank}(B_i + B_j) = \text{rank}\,B_i + \text{rank}\,B_j$. Thus we reach the same impossibility, that the rank of $I = \sum_i B_i$ is at least $n^2$, without relying on the trace operator.
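The verification: if $w = B_i x = B_j y$ lies in both column spaces, then $w = B_i w = B_i B_j y = 0$; and since the cross terms vanish,

$$ (B_i + B_j)^2 = B_i^2 + B_i B_j + B_j B_i + B_j^2 = B_i + B_j. $$

Iterating over all $n^2$ summands gives $\text{rank}\,I = \text{rank}\sum_i B_i = \sum_i \text{rank}\,B_i \geq n^2 > n$ for $n \geq 2$.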