A "geometric" infinite sum of matrices

FWIW: Call $S$ the sum, provided it exists. Then $A^TSA=S-I$, so if the series converges, its sum is a root of this equation.

If $n=2$, then the determinant of the endomorphism $S\mapsto A^TSA-S$ of $M_n(\mathbb C)$ is $(\det A-1)^2\det(A^2-I)$, so for generic $A$ the solution is unique. For larger $n$, I have no idea.
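As a quick numerical sanity check (not a proof) of the $n=2$ determinant formula: in vec coordinates the endomorphism $S\mapsto A^TSA-S$ has matrix $A^T\otimes A^T-I_4$, so one can compare its determinant against $(\det A-1)^2\det(A^2-I)$ for a random $A$. A NumPy sketch:

```python
# Numerical check (not a proof) of the n = 2 determinant identity:
# det(S -> A^T S A - S) = (det A - 1)^2 * det(A^2 - I).
# In vec coordinates the endomorphism has matrix A^T (x) A^T - I_4.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((2, 2))

lhs = np.linalg.det(np.kron(A.T, A.T) - np.eye(4))
rhs = (np.linalg.det(A) - 1) ** 2 * np.linalg.det(A @ A - np.eye(2))
# lhs and rhs agree up to floating-point error
```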


If $F:M_n\to M_n$ is defined by $F(X)=A^\mathrm{T}XA$ and $\mathrm{id}:M_n\to M_n$ is the identity map, then your sum is $(\mathrm{id} - F)^{-1}(I)$.


Let $S$ be the solution to the discrete-time Lyapunov equation $A^TSA=S-I$. Note that $S$ exists and is unique because $$ A^TSA=S-I\ \Leftrightarrow\ (I - A^T\otimes A^T)\textrm{vec}(S)=\textrm{vec}(I) $$ and $I - A^T\otimes A^T$ is invertible owing to the fact that $\rho(A^T\otimes A^T)=\rho(A)^2<1$.
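For concreteness, here is a small NumPy sketch (assuming $\rho(A)<1$; the example matrix is my own choice) that computes $S$ through exactly this vectorisation:

```python
# Solve the discrete-time Lyapunov equation A^T S A = S - I via the
# vec trick: (I - A^T (x) A^T) vec(S) = vec(I).  Requires rho(A) < 1.
import numpy as np

A = np.array([[0.5, 0.3],
              [0.1, 0.4]])   # example matrix with rho(A) < 1
n = A.shape[0]

M = np.eye(n * n) - np.kron(A.T, A.T)
vec_S = np.linalg.solve(M, np.eye(n).reshape(-1, order="F"))
S = vec_S.reshape(n, n, order="F")   # un-vec (column-major)

# S satisfies the original equation:
residual = A.T @ S @ A - (S - np.eye(n))
```

(`order="F"` is used because $\operatorname{vec}$ stacks columns.)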

It follows that if the infinite series $I + A^T A + (A^2)^T A^2 + (A^3)^T A^3 + \cdots$ converges at all, its limit must be $S$.

But does it really converge? Note that for some matrices $A$, such as $\frac1{\sqrt{2}}\pmatrix{1&1\\ 0&1}$, we have $\rho(A)<1$ but also $\max\left\{\|A\|,\|A^T\|\right\}>1$ for every submultiplicative matrix norm. So we cannot argue that $\|A\|\|A^T\|<1$ and conclude that $\|\sum_{j=0}^\infty(A^j)^TA^j\|\le\sum_{j=0}^\infty(\|A\|\|A^T\|)^j<\infty$ (the first inequality is always valid, but the second one is not). The usual trick of proving the convergence of a Neumann series via a submultiplicative norm is therefore not applicable here.
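This counterexample is easy to check numerically: since $\|A\|\|A^T\|\ge\|AA^T\|\ge\rho(AA^T)$ for any submultiplicative norm, it suffices to see that $\rho(A)<1$ while $\rho(AA^T)>1$:

```python
# The counterexample A = (1/sqrt(2)) [[1,1],[0,1]]: rho(A) < 1, yet
# ||A|| ||A^T|| >= ||A A^T|| >= rho(A A^T) > 1 for every
# submultiplicative norm, so the naive norm bound must fail.
import numpy as np

A = np.array([[1.0, 1.0],
              [0.0, 1.0]]) / np.sqrt(2)

rho_A = max(abs(np.linalg.eigvals(A)))          # ~0.707 < 1
rho_AAT = max(abs(np.linalg.eigvals(A @ A.T)))  # ~1.309 > 1
```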

Yet, we can still easily prove that the partial sums converge to $S$. Since $A^TSA=S-I$, one can show by induction that $$ S-\sum_{j=0}^{n-1}(A^j)^TA^j=(A^n)^TSA^n $$ for every $n\ge1$. As $\rho(A)<1$, we have $A^n\to0$ and in turn $(A^n)^TSA^n\to0$ as $n\to\infty$. Hence the infinite series $\sum_{j=0}^\infty(A^j)^TA^j$ does converge to $S$.
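A quick NumPy check of the remainder identity (the example $A$ with $\rho(A)<1$ is my own choice, and $S$ is computed via the vec trick from above):

```python
# Check S - sum_{j=0}^{n-1} (A^j)^T A^j = (A^n)^T S A^n and watch the
# remainder vanish as n grows (rho(A) < 1).
import numpy as np

A = np.array([[0.5, 0.3],
              [0.1, 0.4]])   # example matrix with rho(A) < 1
I = np.eye(2)

# Solve A^T S A = S - I via vectorisation.
S = np.linalg.solve(np.eye(4) - np.kron(A.T, A.T),
                    I.reshape(-1, order="F")).reshape(2, 2, order="F")

partial = np.zeros((2, 2))
An = I.copy()                       # holds A^(n-1) at loop entry
for n in range(1, 30):
    partial += An.T @ An            # add the (n-1)-th term
    An = An @ A                     # now A^n
    assert np.allclose(S - partial, An.T @ S @ An)  # remainder identity

# by now the partial sum is essentially S
```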

Alternatively, we can vectorise the $n$-th partial sum of the infinite series as follows: $$ \operatorname{vec}\left(\sum_{j=0}^n(A^j)^T A^j\right) =\sum_{j=0}^n\left((A^j)^T \otimes (A^j)^T\right)\operatorname{vec}(I) =\sum_{j=0}^n\left(A^T\otimes A^T\right)^j\operatorname{vec}(I).\tag{$\ast$} $$ Now we get a geometric sum on the RHS of $(\ast)$. This effectively resurrects the Neumann series argument. As $\rho\left(A^T\otimes A^T\right)=\rho(A)^2<1$, we have $\|A^T\otimes A^T\|<1$ for some submultiplicative norm. The RHS of $(\ast)$ thus converges as $n\to\infty$, and hence so does the LHS.
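A sanity check of $(\ast)$ itself, for a small $n$ and an arbitrary example matrix:

```python
# Verify (*): vec of the n-th partial sum equals the geometric sum
# sum_{j=0}^{n} (A^T (x) A^T)^j vec(I).
import numpy as np

A = np.array([[0.5, 0.3],
              [0.1, 0.4]])   # example matrix
K = np.kron(A.T, A.T)
vI = np.eye(2).reshape(-1, order="F")

n = 5
lhs = sum(np.linalg.matrix_power(A, j).T @ np.linalg.matrix_power(A, j)
          for j in range(n + 1)).reshape(-1, order="F")
rhs = sum(np.linalg.matrix_power(K, j) @ vI for j in range(n + 1))
# lhs == rhs up to floating-point error
```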