Eigenvalues of the product of two symmetric matrices

Here are the results that you are probably looking for.

The first one is for positive definite matrices only (the statement below corrects a typo in the original: the right relation uses $\prec_w$, not $\prec$).

Theorem (Problem III.6.14; Matrix Analysis, Bhatia 1997). Let $A$ and $B$ be Hermitian positive definite. Let $\lambda^\downarrow(X)$ denote the vector of eigenvalues of $X$ arranged in decreasing order, and let $\lambda^\uparrow(X)$ denote the same vector arranged in increasing order. Then, \begin{equation*} \lambda^\downarrow(A) \cdot \lambda^\uparrow(B) \prec_w \lambda(AB) \prec_w \lambda^\downarrow(A) \cdot \lambda^\downarrow(B), \end{equation*}

where $x \cdot y := (x_1y_1,\ldots ,x_ny_n)$ for $x,y \in \mathbb{R}^n$ and $\prec_w$ is the weak majorization preorder. (The comparison makes sense because $AB$ is similar to the positive definite matrix $A^{1/2}BA^{1/2}$, so its eigenvalues are real and positive.)
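
As a quick sanity check, here is a small numerical sketch (assuming NumPy; the helpers `random_hpd` and `weakly_majorized` are ad hoc names of mine, not from the reference) that tests both weak majorizations on a random pair of Hermitian positive definite matrices:

```python
import numpy as np

def random_hpd(n, rng):
    # Random Hermitian positive definite matrix.
    X = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    return X @ X.conj().T + n * np.eye(n)

def weakly_majorized(x, y, tol=1e-10):
    # x ≺_w y  iff, for every k, the sum of the k largest entries of x
    # is at most the sum of the k largest entries of y.
    xs = np.sort(x)[::-1]
    ys = np.sort(y)[::-1]
    return np.all(np.cumsum(xs) <= np.cumsum(ys) + tol)

rng = np.random.default_rng(0)
n = 5
A = random_hpd(n, rng)
B = random_hpd(n, rng)

lam_A_dec = np.sort(np.linalg.eigvalsh(A))[::-1]   # λ^↓(A)
lam_B_dec = np.sort(np.linalg.eigvalsh(B))[::-1]   # λ^↓(B)
lam_B_inc = lam_B_dec[::-1]                        # λ^↑(B)

# AB is similar to A^{1/2} B A^{1/2}, so its eigenvalues are real and positive.
lam_AB = np.linalg.eigvals(A @ B).real

print(weakly_majorized(lam_A_dec * lam_B_inc, lam_AB))   # expect True
print(weakly_majorized(lam_AB, lam_A_dec * lam_B_dec))   # expect True
```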

However, when dealing with matrix products, it is more natural to consider singular values rather than eigenvalues.

Therefore, the relation that you might actually be looking for is the log-majorization \begin{equation*} \log \sigma^\downarrow(A) + \log\sigma^\uparrow(B) \prec \log\sigma(AB) \prec \log\sigma^\downarrow(A) + \log\sigma^\downarrow(B), \end{equation*} where $A$ and $B$ are arbitrary square matrices (take them invertible if you want all the logarithms to be finite), $\sigma(\cdot)$ denotes the vector of singular values, and $\sigma^\downarrow$, $\sigma^\uparrow$ are its decreasing and increasing rearrangements.
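
The same kind of numerical sketch works here (again assuming NumPy; `majorized` is an ad hoc helper, not a library function). Generic random complex matrices are almost surely invertible, so all logarithms below are finite, and the two outer sums share the same total because $|\det(AB)| = |\det A|\,|\det B|$:

```python
import numpy as np

def majorized(x, y, tol=1e-8):
    # x ≺ y: partial sums of the decreasing rearrangements satisfy
    # sum of k largest of x <= sum of k largest of y, with equality at k = n.
    xs, ys = np.sort(x)[::-1], np.sort(y)[::-1]
    cx, cy = np.cumsum(xs), np.cumsum(ys)
    return np.all(cx <= cy + tol) and abs(cx[-1] - cy[-1]) < tol

rng = np.random.default_rng(1)
n = 5
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
B = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))

sA = np.linalg.svd(A, compute_uv=False)    # σ^↓(A), returned in decreasing order
sB = np.linalg.svd(B, compute_uv=False)    # σ^↓(B)
sAB = np.linalg.svd(A @ B, compute_uv=False)

print(majorized(np.log(sA) + np.log(sB[::-1]), np.log(sAB)))   # expect True
print(majorized(np.log(sAB), np.log(sA) + np.log(sB)))         # expect True
```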

Reference

  1. R. Bhatia, *Matrix Analysis*, Graduate Texts in Mathematics 169, Springer, 1997.