Given a matrix, is there always another matrix which commutes with it?

A square matrix $A$ over a field $F$ commutes with every $F$-linear combination of non-negative powers of $A$. That is, for every $a_0,\dots,a_n\in F$,

$$A\left(\sum_{k=0}^na_kA^k\right)=\sum_{k=0}^na_kA^{k+1}=\left(\sum_{k=0}^na_kA^k\right)A\;.$$

This includes as special cases the identity and zero matrices of the same dimensions as $A$ and of course $A$ itself.
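For concreteness, here is a quick check of this identity in PARI/GP (the same calculator used in a later answer below); the particular polynomial $p(x)=3x^2-2x+5$ is an arbitrary choice:

? A = [1,1;0,1]
%1 = 
[1 1]

[0 1]

? P = 3*A^2 - 2*A + 5*matid(2)
%2 = 
[6 4]

[0 6]

? A*P - P*A
%3 = 
[0 0]

[0 0]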

Added: As was noted in the comments, this amounts to saying that $A$ commutes with $p(A)$ for every polynomial $p$ over $F$. As was also noted, there are matrices that commute only with these. A simple example is the matrix $$A=\pmatrix{1&1\\0&1}:$$ it’s easily verified that the matrices that commute with $A$ are precisely those of the form $$\pmatrix{a&b\\0&a}=bA+(a-b)I=bA^1+(a-b)A^0\;.$$ At the other extreme, a scalar multiple of an identity matrix commutes with all matrices of the same size.
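This characterization is quick to verify symbolically in GP: the commutator of $A$ with a general $2\times 2$ matrix vanishes exactly when $c=0$ and $d=a$:

? A = [1,1;0,1]
%1 = 
[1 1]

[0 1]

? B = [a,b;c,d]
%2 = 
[a b]

[c d]

? A*B - B*A
%3 = 
[c -a + d]

[0     -c]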


Given an $n \times n$ matrix $A$ over a field $k,$ it is always true that $A$ commutes with any $p(A),$ where $p(x)$ is a polynomial with coefficients in $k.$ Note that the constant term $p_0$ of the polynomial refers to $p_0 I$ here, where $I$ is the identity matrix. Also, by Cayley–Hamilton, any such polynomial may be rewritten as one of degree no larger than $n-1,$ and this applies even to power series such as $e^A,$ although in that case it is better to compute $e^A$ first and then figure out how to write it as a finite polynomial.
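To illustrate with the smallest interesting case, take $$A = \pmatrix{\lambda & 1 \\ 0 & \lambda} = \lambda I + N, \qquad N^2 = 0.$$ Since $\lambda I$ and $N$ commute, $$e^A = e^{\lambda I} e^N = e^{\lambda}(I + N) = e^{\lambda} A + (1-\lambda)e^{\lambda} I,$$ a polynomial in $A$ of degree $1 = n-1.$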

THEOREM: The following are equivalent:

(I) $A$ commutes only with matrices $B = p(A)$ for some $p(x) \in k[x].$

(II) The minimal polynomial and the characteristic polynomial of $A$ coincide. Note that if we enlarge the field beyond the one containing the matrix entries, neither the characteristic nor the minimal polynomial changes; nice proofs for the minimal polynomial are at Can a matrix in $\mathbb{R}$ have a minimal polynomial that has coefficients in $\mathbb{C}$? This condition is easy to test by machine; see the GP check after the references below.

(III) $A$ is similar to a companion matrix.

(IV) After passing, if necessary, to a field extension over which the characteristic polynomial factors completely, each eigenvalue occurs in only one Jordan block.

(V) $A$ has a cyclic vector, that is, some $v$ such that $ \{ v,Av,A^2v, \ldots, A^{n-1}v \} $ is a basis for the vector space.

See GAILLARD MINIMAL SIMILAR COMPANION

The equivalence of (II) and (III) is Corollary 9.43 on page 674 of ROTMAN.
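Condition (II) is easy to test by machine: in GP, charpoly and minpoly compute the two polynomials directly. They coincide for the $2 \times 2$ matrix from the first answer, but not for the identity:

? charpoly([1,1;0,1])
%1 = x^2 - 2*x + 1
? minpoly([1,1;0,1])
%2 = x^2 - 2*x + 1
? charpoly(matid(2))
%3 = x^2 - 2*x + 1
? minpoly(matid(2))
%4 = x - 1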

=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=

Theorem: If $A$ has a cyclic vector, that is, some $v$ such that $$ \{ v,Av,A^2v, \ldots, A^{n-1}v \} $$ is a basis for the vector space, then $A$ commutes only with matrices $B = p(A)$ for some $p(x) \in k[x].$

There is a nice short proof by Gerry at Complex matrix that commutes with another complex matrix.

This is actually an if and only if; see Statement and Proof.
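Condition (V) can also be tested in GP by forming the Krylov matrix with columns $v, Av, \ldots, A^{n-1}v$ (here glued together with matconcat) and checking its rank. For the $2 \times 2$ example above, $v = (0,1)^T$ is cyclic while the eigenvector $w = (1,0)^T$ is not:

? A = [1,1;0,1]
%1 = 
[1 1]

[0 1]

? v = [0,1]~
%2 = [0, 1]~
? matrank(matconcat([v, A*v]))
%3 = 2
? w = [1,0]~
%4 = [1, 0]~
? matrank(matconcat([w, A*w]))
%5 = 1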

=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=

Note that if the field $k$ is algebraically closed, as the complex numbers are, we may then ask about the Jordan normal form of $A.$ In this case, the condition is that each eigenvalue belong to only a single Jordan block. This includes the easiest case, when all eigenvalues are distinct: then the Jordan form is just a diagonal matrix with distinct entries on the diagonal, no repeats.
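To see why distinct eigenvalues suffice: suppose $A$ is similar to $D = \operatorname{diag}(d_1, \ldots, d_n)$ with the $d_i$ pairwise distinct. If $DB = BD,$ comparing entries gives $(d_i - d_j)B_{ij} = 0,$ so $B$ must be diagonal; Lagrange interpolation then produces a polynomial $p$ with $p(d_i) = B_{ii}$ for every $i,$ and therefore $B = p(D).$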

=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=

For example, let $$ A \; = \; \left( \begin{array}{ccc} \lambda & 0 & 0 \\ 0 & \lambda & 1 \\ 0 & 0 & \lambda \end{array} \right). $$

Next, with $r \neq s,$ take $$ B \; = \; \left( \begin{array}{ccc} r & 0 & 0 \\ 0 & s & t \\ 0 & 0 & s \end{array} \right). $$

We do get

$$ AB \; = \; BA \; = \; \left( \begin{array}{ccc} \lambda r & 0 & 0 \\ 0 & \lambda s & \lambda t + s \\ 0 & 0 & \lambda s \end{array} \right). $$ However, since $r \neq s,$ we know that $B$ cannot be written as a polynomial in $A$: any $p(A)$ here has the same value $p(\lambda)$ in all three diagonal entries, while $B$ does not.
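For completeness, here is a quick numerical check of this example in GP, with the arbitrary sample values $\lambda = 5,$ $r = 2,$ $s = 3,$ $t = 7$:

? A = [5,0,0; 0,5,1; 0,0,5]
%1 = 
[5 0 0]

[0 5 1]

[0 0 5]

? B = [2,0,0; 0,3,7; 0,0,3]
%2 = 
[2 0 0]

[0 3 7]

[0 0 3]

? A*B - B*A
%3 = 
[0 0 0]

[0 0 0]

[0 0 0]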

=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=

A side note: see Computing the dimension of a vector space of matrices that commute with a given matrix B, answer by Pedro. The matrices that commute only with polynomials in themselves are one extreme; the dimension of that vector space of matrices is just $n.$ The other extreme is the identity matrix, which commutes with everything, giving dimension $n^2.$ For some in-between matrix, what is the dimension of the vector space of matrices with which it commutes?
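One way to compute this dimension in GP is to write out the linear map $X \mapsto AX - XA$ as an $n^2 \times n^2$ matrix acting on the entries of $X$ and take its nullity; the commutant of $A$ is exactly the kernel of this map. The function commdim below is a sketch of my own devising (the name is made up, not a GP built-in):

commdim(A) =
{
  my(n = matsize(A)[1], M = matrix(n^2, n^2));
  \\ column (i-1)*n + j of M records A*E - E*A for the basis matrix E = E_{ij}
  for(i = 1, n,
    for(j = 1, n,
      my(E = matrix(n, n, r, c, (r == i) * (c == j)));
      my(C = A*E - E*A);
      for(r = 1, n,
        for(c = 1, n,
          M[(r-1)*n + c, (i-1)*n + j] = C[r, c]))));
  \\ dimension of the commutant = nullity of the map
  n^2 - matrank(M)
}

Loading this and then trying the two extremes:

? commdim([1,1;0,1])
%1 = 2
? commdim(matid(3))
%2 = 9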

=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=

Finally, we show what form a square matrix must take if it commutes with a Jordan block. There is no loss of generality in taking the eigenvalue of the Jordan block to be zero, and doing so makes everything cleaner. The sessions below are PARI/GP transcripts:

? jordan = [ 0,1;0,0]
%1 = 
[0 1]

[0 0]

? symbols = [a,b;c,d]
%2 = 
[a b]

[c d]

? jordan * symbols - symbols * jordan
%3 = 
[c -a + d]

[0     -c]

? symbols = [ a,b,c;d,e,f;g,h,i]
%4 = 
[a b c]

[d e f]

[g h i]

? jordan = [ 0,1,0; 0,0,1; 0,0,0]
%5 = 
[0 1 0]

[0 0 1]

[0 0 0]

? jordan * symbols - symbols * jordan
%6 = 
[d -a + e -b + f]

[g -d + h -e + i]

[0     -g     -h]

? jordan = [ 0,1,0,0; 0,0,1,0; 0,0,0,1;0,0,0,0]
%7 = 
[0 1 0 0]

[0 0 1 0]

[0 0 0 1]

[0 0 0 0]

? symbols = [ a,b,c,d; e,f,g,h; i,j,k,l; m,n,o,p]
%8 = 
[a b c d]

[e f g h]

[i j k l]

[m n o p]

? jordan * symbols - symbols * jordan
%9 = 
[e -a + f -b + g -c + h]

[i -e + j -f + k -g + l]

[m -i + n -j + o -k + p]

[0     -m     -n     -o]
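Setting the last commutator to zero forces $e = i = m = n = o = 0$ from the first column and last row, then $j = e = 0$ and $f = k = p = a,$ $g = l = b,$ $h = c$ along the diagonals. So the matrices commuting with the $4 \times 4$ Jordan block $J$ are exactly the upper triangular Toeplitz matrices $$ B \; = \; \left( \begin{array}{cccc} a & b & c & d \\ 0 & a & b & c \\ 0 & 0 & a & b \\ 0 & 0 & 0 & a \end{array} \right) \; = \; a I + b J + c J^2 + d J^3 , $$ that is, exactly the polynomials in $J,$ as the theorem predicts for a single Jordan block.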

=================================


This is not meant to be a complete answer, just a hint to help understand the problem a bit better.

Note that the commutativity $AB=BA$ is equivalent (when $B$ is invertible) to $A=BAB^{-1}$.

On the other hand, conjugate matrices represent the same endomorphism of the underlying vector space with respect to different bases. Thus $A=BAB^{-1}$ means that the endomorphism corresponding to $A$ has the same matrix representation when the basis is changed by $B$.

How could that be possible?

If $A$ is diagonalizable with distinct eigenvalues, i.e. the vector space admits a basis of eigenvectors whose eigenvalues are pairwise distinct, the only way we can modify the basis while leaving $A$ as the matrix representing the endomorphism is to replace each eigenvector by a scalar multiple of itself. This corresponds to the situation where the only matrices commuting with $A$ are the linear combinations of its powers.

On the other hand, if there is an eigenspace $E_\lambda$ of dimension $\geq2$ we can replace any choice of basis of $E_\lambda$ with any other, and the matrix representing the endomorphism will be left the same. Thus we should expect more matrices commuting with $A$ in this case.
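This can be seen concretely in GP: take the diagonal matrix $A$ with entries $1, 1, 2,$ so that the eigenspace $E_1$ has dimension $2.$ The commutator with a general matrix vanishes exactly when $c = f = g = h = 0$:

? A = [1,0,0; 0,1,0; 0,0,2]
%1 = 
[1 0 0]

[0 1 0]

[0 0 2]

? S = [a,b,c; d,e,f; g,h,i]
%2 = 
[a b c]

[d e f]

[g h i]

? A*S - S*A
%3 = 
[0 0 -c]

[0 0 -f]

[g h  0]

The five entries $a, b, d, e, i$ remain free, so the commutant has dimension $5,$ while the polynomials in $A$ span only $2$ dimensions here (the minimal polynomial is $(x-1)(x-2)$).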

Hope this helps in understanding the problem.