Trace of a nonlinear matrix equation (cont'd)

Actually, you have completely solved it yourself; you just didn't dare to acknowledge it. In my notation, you have $(X\circ X^T)v(A)=(Y\circ Y^T)v(I)$ when $Y^2=XAX$. Similarly, $(Z\circ Z^T)v(I)=(Y\circ Y^T)v(A)$ when $Z^2=YAY$. Taking the trace, we must have $$ 1=\langle(X\circ X^T)v(I),v(I)\rangle=\langle(Y\circ Y^T)v(I),v(I)\rangle=\langle(Z\circ Z^T)v(I),v(I)\rangle\,. $$ However, $$ \langle(Y\circ Y^T)v(I),v(I)\rangle=\langle(X\circ X^T)v(A),v(I)\rangle $$ and $$ \langle(Z\circ Z^T)v(I),v(I)\rangle=\langle(Y\circ Y^T)v(A),v(I)\rangle \\ = \langle(Y\circ Y^T)v(I),v(A)\rangle=\langle(X\circ X^T)v(A),v(A)\rangle $$ so $$ \langle(X\circ X^T)(v(A)-v(I)),(v(A)-v(I))\rangle=0\,. $$ Since $\langle(X\circ X^T)v(W),v(W)\rangle=\operatorname{tr}(XWXW)=\|X^{1/2}WX^{1/2}\|_F^2$ for symmetric $W$, it follows that $X\circ X^T$ and, thereby, $X$ must be degenerate unless $v(A)=v(I)$, i.e., $A=I$.
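A quick numerical sanity check of these identities, sketched in NumPy. I am reading $X\circ X^T$ as the Kronecker product $X\otimes X^T$ acting on row-major vectorizations $v(\cdot)$, which reproduces $v(XAX)=(X\circ X^T)v(A)$ for symmetric $X$; that reading is my assumption, not spelled out in the answer:

```python
import numpy as np

rng = np.random.default_rng(0)

def rand_spd(n):
    # random symmetric positive definite matrix
    M = rng.standard_normal((n, n))
    return M @ M.T + n * np.eye(n)

n = 4
X = rand_spd(n)
A = rand_spd(n)
I = np.eye(n)

v = lambda M: M.flatten()          # row-major vectorization
K = np.kron(X, X.T)                # candidate reading of X∘X^T

# v(XAX) = (X∘X^T) v(A) in this convention
assert np.allclose(K @ v(A), v(X @ A @ X))

# ⟨(X∘X^T)v(A), v(I)⟩ = tr(XAX)
assert np.isclose(K @ v(A) @ v(I), np.trace(X @ A @ X))

# the final quadratic form is a squared Frobenius norm:
# ⟨(X∘X^T)v(W), v(W)⟩ = ‖X^{1/2} W X^{1/2}‖_F^2 for symmetric W
w = v(A - I)
lam, U = np.linalg.eigh(X)
Xh = U @ np.diag(np.sqrt(lam)) @ U.T   # PSD square root of X
assert np.isclose(w @ (K @ w),
                  np.linalg.norm(Xh @ (A - I) @ Xh, 'fro')**2)
```

The last assertion is the reason the vanishing quadratic form forces either $A=I$ or a degenerate $X$.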

This story definitely has a few morals but I'll abstain from spelling them out :-).


I sketch here a solution for the $2\times 2$ case. My hope is that the outlined approach can be used to find a general solution.

As noticed in Addendum 1, we can restrict wlog to diagonal positive definite $A$'s $$ A := \begin{bmatrix}d_1 & 0 \\ 0 & d_2\end{bmatrix}. $$ I consider the case $d_1>1$ and $0<d_2<1$, the other cases being trivial. Define $X_0^{1/2}$ as $$ X_0^{1/2} := \begin{bmatrix}a & b \\ b & c\end{bmatrix}. $$ Since $X_0>0$ and hence $X_0^{1/2}>0$, it holds that $a>0$, $c>0$, and $ac-b^2>0$. Moreover, since $\mathrm{tr}(X_0)=1$, we have $$ a^2+2b^2+c^2=1. \quad (1) $$

Now after one iteration step we obtain \begin{align} X_1 &= X_0^{1/2}AX_0^{1/2}= \begin{bmatrix} a^2d_1+b^2d_2 & \ast \\ \ast & b^2d_1+c^2d_2\end{bmatrix}. \end{align} Since $\mathrm{tr}(X_1)=1$, it follows that $$ a^2d_1+b^2(d_1+d_2)+c^2d_2=1. \quad (2) $$ Let us define $$ X_1^{1/2} := \begin{bmatrix} a_1 & b_1 \\ b_1 & c_1\end{bmatrix}. $$ Since $X_1=X_1^{1/2}X_1^{1/2}$, we get \begin{align} a_1^2+b_1^2=a^2d_1+b^2d_2 \ \ \text{ and }\ \ b_1^2+c_1^2=b^2d_1+c^2d_2. \quad(\#) \end{align}

Now consider the second iteration step $$ X_2 = X_1^{1/2}AX_1^{1/2}= \begin{bmatrix} a_1^2d_1+b_1^2d_2 & \ast \\ \ast & b_1^2d_1+c_1^2d_2\end{bmatrix}. $$ We have \begin{align} \mathrm{tr}(X_2) &= d_1(a_1^2+b_1^2)+d_2(b_1^2+c_1^2)\\ &\overset{(\#)}{=}a^2d_1^2+2b^2d_1d_2+c^2d_2^2=1,\quad (3) \end{align} by virtue of $(\#)$ and of the trace constraint $\mathrm{tr}(X_2)=1$.

By collecting $(1)$, $(2)$, and $(3)$ we arrive at the following linear system $$ \begin{bmatrix} 1 & 2 & 1 \\ d_1 & d_1+d_2 & d_2 \\ d_1^2 & 2d_1d_2 & d_2^2\end{bmatrix}\begin{bmatrix}a^2 \\ b^2\\ c^2\end{bmatrix}=\begin{bmatrix}1 \\ 1\\ 1\end{bmatrix}. $$ The solution $(\hat{a}^2, \hat{b}^2, \hat{c}^2)$ of the previous system (you can evaluate it manually or with a symbolic toolbox, as I did) is such that $$ \sqrt{\hat a^2 \hat c^2}-\hat b^2 = \frac{|(d_1-1)(d_2-1)|+(d_1-1)(d_2-1)}{(d_1-d_2)^2}=0 $$ since $d_1>1$ and $0<d_2<1$, by assumption. But then $\hat a\hat c-\hat b^2=0$, so $X_0^{1/2}$ is singular, which is a contradiction since $X_0>0$.
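The conclusion can be checked numerically; here is a sketch in NumPy. The sample values $d_1=2$, $d_2=1/2$ are my own choice, and the middle entry $2d_1d_2$ in the last row of the system matrix comes from constraint (3):

```python
import numpy as np

d1, d2 = 2.0, 0.5          # sample values with d1 > 1 > d2 > 0

# linear system collecting the trace constraints (1), (2), (3)
M = np.array([[1.0,    2.0,          1.0  ],
              [d1,     d1 + d2,      d2   ],
              [d1**2,  2 * d1 * d2,  d2**2]])
a2, b2, c2 = np.linalg.solve(M, np.ones(3))

# det X0^{1/2} = ac - b^2 must vanish, i.e. sqrt(a^2 c^2) = b^2
assert np.isclose(np.sqrt(a2 * c2), b2)

# the closed-form expression (zero here since (d1-1)(d2-1) < 0)
x = (d1 - 1) * (d2 - 1)
assert np.isclose(np.sqrt(a2 * c2) - b2, (abs(x) + x) / (d1 - d2)**2)
```

For these values the solution is $(\hat a^2,\hat b^2,\hat c^2)=(1/9,\,2/9,\,4/9)$, which indeed forces a singular $X_0^{1/2}$.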


Update. It is possible to generalize the above procedure to the case $$ A = \begin{bmatrix}d_1I_{n_1} & 0\\ 0&d_2I_{n_2}\end{bmatrix} $$ with $d_1>1$ and $0<d_2<1$ scalars. Indeed, consider the following block decomposition of $X_0^{1/2}$ $$ X_0^{1/2} := \begin{bmatrix}A & B \\ B^\top & C\end{bmatrix} $$ (with a slight abuse of notation, since $A$, $B$, $C$ here denote blocks of $X_0^{1/2}$). By replacing $a^2$, $c^2$, and $b^2$ with $\mathrm{tr}(A^2)$, $\mathrm{tr}(C^2)$, and $\mathrm{tr}(B^\top B)$, respectively, and following the above solution almost verbatim, we get the desired result.


First, let me modify the given problem for the sake of simplicity.


Let $X_0>0$ be an $n \times n$ matrix with unit trace, written as $$ X_0 = \sum_{1 \le i \le n} \lambda_i v_i v_i^T \;, \quad \sum_{1 \le i \le n} \lambda_i = 1 $$ where $\{ \lambda_i \}$ and $\{ v_i \}$ are the positive eigenvalues and orthonormal eigenvectors of $X_0$, respectively. Set $A>0$ to be $$ A = \sum_{1 \le i \le n} \alpha_i v_i v_i^T \tag{$\diamond$} $$ where $\{ \alpha_i \}$ are the positive eigenvalues of $A$; we stress that $A$ has the same eigenvectors as $X_0$. If the solution to the matrix recurrence relation $$ X_{k+1} = X_k^{1/2} A X_k^{1/2} $$ with initial data $X_0$ satisfies $\operatorname{trace}(X_{k})=1$ for all natural numbers $k \ge 0$, then $A=I_n$, i.e., $\alpha_i = 1$ for all $1 \le i \le n$.


Remark. The difference between this formulation and the OP's formulation is that $A$ has the same eigenvectors as the given seed $X_0$. Admittedly, this form of $A$ is restrictive, but it is nicer to work with because, as we will see, it simplifies the subsequent calculations.

For the sake of contradiction, suppose that $A \ne I_n$, i.e., not all $\alpha_i = 1$ for $1 \le i \le n$. Since $$ X_0^{1/2} = \sum_{1 \le i \le n} \sqrt{\lambda_i} v_i v_i^T $$ after one step of the recurrence relation we have: \begin{align*} X_1 &= \sum_{1 \le i,j \le n} \sqrt{\lambda_i} \sqrt{\lambda_j} (v_i^T A v_j) v_i v_j^T \\ &= \sum_{1 \le i,j,k \le n}\sqrt{\lambda_i} \sqrt{\lambda_j} \alpha_k (v_i^T v_k) (v_k^T v_j) v_i v_j^T \\ &= \sum_{1 \le i,j \le n}\sqrt{\lambda_i} \sqrt{\lambda_j} \alpha_i \delta_{ij} v_i v_j^T \qquad \text{($\delta_{ij}$ is the Kronecker delta)}\\ &= \sum_{i=1}^n \lambda_i \alpha_i v_i v_i^T \end{align*} and the unit trace requirement implies that $$ \sum_{i=1}^n \lambda_i \alpha_i = 1 \;. $$ Iterating the above calculation $k$ times, we see that the unit trace requirement implies that $\sum_{i=1}^n \lambda_i \alpha_i^k = 1$, which, according to our hypotheses, must hold for every natural number $k \ge 0$. A more transparent way to write this requirement is $$ \mathbf{V}^T \boldsymbol{\lambda} = \mathbf{1} \tag{$\star$} $$ where we have introduced an infinite Vandermonde-like matrix and two vectors: $$ \mathbf{V} = \begin{bmatrix} 1 & \alpha_1 & \cdots & \alpha_1^k & \cdots \\ 1 & \alpha_2 & \cdots & \alpha_2^k & \cdots \\ \vdots & \vdots & \ddots & \vdots & \cdots \\ 1 & \alpha_n & \cdots & \alpha_n^k & \cdots \end{bmatrix} \;, \quad \boldsymbol{\lambda} = \begin{bmatrix} \lambda_1 \\ \vdots \\ \lambda_n \end{bmatrix} \;, \quad \mathbf{1} = \begin{bmatrix} 1 \\ 1 \\ \vdots \end{bmatrix} $$ If $\alpha_1 = \cdots = \alpha_n = 1$, then the solution set of ($\star$) consists of all positive definite matrices with unit trace. (This is the trivial case.) 
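The calculation above can be verified numerically; this is a sketch assuming a randomly drawn shared eigenbasis, with the matrix square root computed via an eigendecomposition:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4

# shared orthonormal eigenvectors via QR of a random matrix
V, _ = np.linalg.qr(rng.standard_normal((n, n)))
lam = rng.random(n) + 0.1
lam /= lam.sum()                    # unit-trace seed X0
alpha = rng.random(n) + 0.5         # eigenvalues of A

X = V @ np.diag(lam) @ V.T          # X0
A = V @ np.diag(alpha) @ V.T        # same eigenvectors as X0

for k in range(1, 6):
    # one step: X_{k+1} = X_k^{1/2} A X_k^{1/2}
    w, U = np.linalg.eigh(X)
    Xh = U @ np.diag(np.sqrt(w)) @ U.T
    X = Xh @ A @ Xh
    # closed form: X_k = sum_i lam_i alpha_i^k v_i v_i^T
    assert np.allclose(X, V @ np.diag(lam * alpha**k) @ V.T)
    # hence tr(X_k) = sum_i lam_i alpha_i^k
    assert np.isclose(np.trace(X), np.sum(lam * alpha**k))
```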
However, if even one eigenvalue of $A$ differs from one (the case at hand), then the system ($\star$) has no solution with all $\lambda_i > 0$. Indeed, if some $\alpha_i > 1$, then $\sum_i \lambda_i \alpha_i^k \to \infty$ as $k \to \infty$, while if all $\alpha_i \le 1$ with at least one $\alpha_i < 1$, then $\sum_i \lambda_i \alpha_i^k < \sum_i \lambda_i = 1$ for every $k \ge 1$. Either way, the unit trace requirement fails at some finite $k$, so there exists no $X_0$ satisfying the hypotheses above. This contradiction is resolved only if $\alpha_1 = \cdots = \alpha_n = 1$, i.e., $A=I_n$.