Properties of matrix $X=\left[\frac{1}{1-\bar\alpha_i \alpha_j}\right]_{ij}$

This matrix is also related to the Nevanlinna-Pick theorem. Namely, if $z_i, \lambda_i \in \mathbb D$ for $1\leq i\leq n$, then $$\left[\frac{1- \overline{z_j}z_i}{1-\overline{\lambda_j}\lambda_i}\right]_{i,j=1}^n \geq 0$$ if and only if there is a holomorphic function $\varphi : \mathbb D \rightarrow \overline{\mathbb D}$ such that $\varphi(\lambda_i) = z_i$ for $1\leq i\leq n$.

In this theory, $K(\lambda_j,\lambda_i) = (1-\overline{\lambda_j}\lambda_i)^{-1}$ is called the Szegő kernel. So one could refer to your matrix as the Szegő kernel matrix.
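
For concreteness, here is a small numerical illustration of the "if" direction (a sketch only; the interpolating function $\varphi(z)=z^2$ and the sample points are arbitrary choices, not taken from the question):

```python
import numpy as np

# Sketch: one direction of the Nevanlinna-Pick theorem.  Take an
# arbitrary holomorphic phi mapping the unit disk into itself, here
# phi(z) = z**2, sample it at arbitrary points lambda_i in the disk,
# and check that the resulting Pick matrix is positive semidefinite.
rng = np.random.default_rng(0)
lam = 0.9 * (rng.random(5) + 1j * rng.random(5)) / np.sqrt(2)  # |lambda_i| < 1
z = lam**2                                                     # z_i = phi(lambda_i)

# Pick matrix P_ij = (1 - conj(z_j) z_i) / (1 - conj(lambda_j) lambda_i)
P = (1 - np.conj(z)[None, :] * z[:, None]) / (1 - np.conj(lam)[None, :] * lam[:, None])

print(np.linalg.eigvalsh(P).min() >= -1e-12)   # True: P is psd
```

(For this particular $\varphi$ the Pick matrix degenerates: $\frac{1-(\overline{\lambda_j}\lambda_i)^2}{1-\overline{\lambda_j}\lambda_i} = 1+\overline{\lambda_j}\lambda_i$, so it is psd of rank at most two.)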


Assuming all the $\alpha_j$ are nonzero, the matrices $X$ are Cauchy-like matrices, since you can rewrite them as $$ X_{ij} = \frac{\alpha_j^{-1}}{\alpha_j^{-1}-\bar{\alpha}_i}, $$ so there are analogous formulas for their determinant and inverse. In particular, $XA$ is a Cauchy matrix, with entries $(\alpha_j^{-1}-\bar{\alpha}_i)^{-1}$, where $A = \operatorname{diag}(\alpha_i)$, so these formulas follow directly from those for Cauchy matrices.
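
A quick numerical sanity check of this Cauchy structure, including the classical Cauchy determinant formula (a sketch; the $\alpha_i$ below are arbitrary nonzero points in the open unit disk):

```python
import numpy as np

# Sketch: verify that X*A is a Cauchy matrix and recover det(X) from
# the classical Cauchy determinant formula.
rng = np.random.default_rng(1)
alpha = (0.2 + 0.7 * rng.random(4)) * np.exp(2j * np.pi * rng.random(4))

X = 1.0 / (1 - np.conj(alpha)[:, None] * alpha[None, :])   # X_ij = 1/(1 - conj(a_i) a_j)
A = np.diag(alpha)

# X @ A has entries 1/(1/alpha_j - conj(alpha_i)): a Cauchy matrix
x = -np.conj(alpha)                      # row nodes
y = 1.0 / alpha                          # column nodes
C = 1.0 / (x[:, None] + y[None, :])
print(np.allclose(X @ A, C))             # True

# Cauchy determinant formula for C, then det(X) = det(C) / prod(alpha_j)
n = len(alpha)
num = np.prod([(x[j] - x[i]) * (y[j] - y[i]) for i in range(n) for j in range(i + 1, n)])
den = np.prod(x[:, None] + y[None, :])
print(np.allclose(np.linalg.det(X), num / den / np.prod(alpha)))   # True
```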

A variant of this equivalence that does not require the invertibility of $A$ and better exploits the Hermitian symmetry is the following. Let $B = (I+A)(A-I)^{-1}$ (note that $\alpha_i\neq 1$, otherwise there would be a zero denominator in $X_{ii}$, so $A-I$ is invertible). Then, $$ B^*X+XB = -2(A-I)^{-*}E(A-I)^{-1} $$ expands to $$ (I+A^*)X(A-I) + (A^*-I)X(I+A) = -2E, $$ which reduces to $$ 2A^*XA - 2X = -2E, $$ which is the Stein equation that you used to define $X$.
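
A direct numerical check of this equivalence (a sketch; here $E$ denotes the all-ones matrix, so that the entrywise identity $X_{ij}(1-\bar{\alpha}_i\alpha_j)=1$ reads $X - A^*XA = E$, and the $\alpha_i$ are arbitrary points of the open unit disk):

```python
import numpy as np

# Sketch: check the Stein equation for X and its Cayley-transformed
# Lyapunov form with B = (I + A)(A - I)^{-1}.
rng = np.random.default_rng(2)
alpha = (0.2 + 0.7 * rng.random(4)) * np.exp(2j * np.pi * rng.random(4))

X = 1.0 / (1 - np.conj(alpha)[:, None] * alpha[None, :])
A = np.diag(alpha)
I = np.eye(len(alpha))
E = np.ones_like(X)

# Stein equation satisfied by X
print(np.allclose(X - A.conj().T @ X @ A, E))                    # True

# Equivalent Lyapunov equation after the Cayley transform
B = (I + A) @ np.linalg.inv(A - I)
W = np.linalg.inv(A - I)                                         # (A - I)^{-1}
print(np.allclose(B.conj().T @ X + X @ B,
                  -2 * W.conj().T @ E @ W))                      # True
```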

So $X$ solves a Lyapunov equation with a diagonal $B$, and hence one can write the more symmetric formula

$$X_{ij} = \frac{-2(\alpha_i-1)^{-*}(\alpha_j-1)^{-1}}{\bar{\beta}_i + \beta_j},$$

where $\beta_i = \frac{\alpha_i + 1}{\alpha_i-1}$ are the diagonal entries of $B$. Note that $\Re\beta_i < 0$ iff $|\alpha_i| < 1$.

Alternatively, the Hermitian matrix $-\frac{1}{2}(A^*-I)X(A-I)$ is a Cauchy matrix with respect to the two sequences $\bar{\beta}_i$ and $-\beta_j$, i.e., its entries are $(\bar{\beta}_i + \beta_j)^{-1}$.
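
The same kind of numerical confirmation works for the $\beta$-based formula and for the Cauchy structure after the Cayley transform (a sketch with arbitrary $\alpha_i$ in the open unit disk):

```python
import numpy as np

# Sketch: check the beta-based formula for X_ij and the Cauchy matrix
# obtained from X after the Cayley transform.
rng = np.random.default_rng(3)
alpha = (0.2 + 0.7 * rng.random(4)) * np.exp(2j * np.pi * rng.random(4))

X = 1.0 / (1 - np.conj(alpha)[:, None] * alpha[None, :])
beta = (alpha + 1) / (alpha - 1)
print(np.all(beta.real < 0))                                     # True, since |alpha_i| < 1

# X_ij = -2 / ( conj(alpha_i - 1) * (alpha_j - 1) * (conj(beta_i) + beta_j) )
X2 = -2 / (np.conj(alpha - 1)[:, None] * (alpha - 1)[None, :]
           * (np.conj(beta)[:, None] + beta[None, :]))
print(np.allclose(X, X2))                                        # True

# -1/2 (A* - I) X (A - I) is the Cauchy matrix [1/(conj(beta_i) + beta_j)]
A = np.diag(alpha)
I = np.eye(len(alpha))
C = 1.0 / (np.conj(beta)[:, None] + beta[None, :])
print(np.allclose(-0.5 * (A.conj().T - I) @ X @ (A - I), C))     # True
```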

This trick to convert between Lyapunov and Stein equations is classical (bilinear transform, or Cayley transform).


These matrices are a special case of a much broader class of positive semidefinite (psd) matrices constructed as follows. Let $(\alpha_{i,j})_{i,j=1}^n$ be any psd matrix satisfying $|\alpha_{i,j}| < 1$ for all $i,j \in \{1,\ldots,n\}$. Then the matrix $((1-\alpha_{i,j})^{-c})_{i,j=1}^n$ is positive semidefinite for every $c \geq 0$. This follows from the binomial series $(1-\alpha)^{-c} = \sum_{k\geq 0}\binom{c+k-1}{k}\alpha^k$ (the geometric series when $c=1$), whose coefficients are nonnegative: by the Schur product theorem every entrywise power of a psd matrix is psd, and nonnegative combinations and entrywise limits of psd matrices are psd. The matrix you mention is constructed in this way, with $c=1$, from the rank-one matrix $(\alpha_1,\ldots,\alpha_n)^*(\alpha_1,\ldots,\alpha_n)$, which is transparently psd.
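
A quick numerical illustration of this construction (a sketch; the psd matrix and the exponents $c$ below are arbitrary):

```python
import numpy as np

# Sketch: for an arbitrary Hermitian psd matrix (alpha_ij) with
# |alpha_ij| < 1, check that the entrywise power (1 - alpha_ij)^{-c}
# is psd for a few exponents c >= 0.
rng = np.random.default_rng(4)
G = rng.standard_normal((5, 3)) + 1j * rng.standard_normal((5, 3))
S = G @ G.conj().T                       # psd
S = S / (np.abs(S).max() + 1.0)          # rescale so that |alpha_ij| < 1

for c in [0.5, 1.0, 2.5]:
    M = (1 - S) ** (-c)                  # entrywise power
    w = np.linalg.eigvalsh(M)
    print(c, w.min() >= -1e-10 * max(w.max(), 1.0))   # True: M is psd up to roundoff
```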