Eigenvalues of $A$ and $A + A^T$

EDIT: Let $\lambda \in \operatorname{spectrum}(A)$; then there is $\mu \in \operatorname{spectrum}(A+A^T)$ such that $|\lambda-\mu|\leq \|A\|_2$ (spectral norm).

Proof: Write $A = (A+A^T) + (-A^T)$, i.e. view $A$ as a perturbation of $A+A^T$. Since $A+A^T$ is real symmetric, it is diagonalized by an orthogonal matrix, so its eigenvector condition number is $1$; the Bauer–Fike theorem then gives a $\mu \in \operatorname{spectrum}(A+A^T)$ such that $|\lambda-\mu|\leq \|-A^T\|_2 = \|A\|_2$. Cf. http://en.wikipedia.org/wiki/Bauer%E2%80%93Fike_theorem
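
Here is a quick numerical sanity check of the bound (my own sketch in NumPy, not part of the original argument): for random $A$, every eigenvalue of $A$ should lie within distance $\|A\|_2$ of some eigenvalue of $A+A^T$.

```python
import numpy as np

rng = np.random.default_rng(0)
for _ in range(1000):
    n = int(rng.integers(2, 10))
    A = rng.standard_normal((n, n))
    lam = np.linalg.eigvals(A)        # eigenvalues of A (possibly complex)
    mu = np.linalg.eigvalsh(A + A.T)  # eigenvalues of A + A^T (real)
    norm_A = np.linalg.norm(A, 2)     # spectral norm ||A||_2
    # distance from each lambda to the nearest mu
    dist = np.min(np.abs(lam[:, None] - mu[None, :]), axis=1)
    assert np.all(dist <= norm_A + 1e-10), "Bauer-Fike bound violated"
print("bound held in all random trials")
```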


This question was answered here; see also the comments on its "closed-as-duplicate" post, especially those by Terry Tao.

Here is Tao's comment from the now-deleted post 2:

"Note that if A is strictly upper triangular, then its eigenvalues are all zero, whereas $A+A^T$ is an arbitrary symmetric matrix with zero diagonal, which constrains the trace of the matrix but otherwise imposes almost no conditions on the spectrum whatsoever (the only other constraint I can see is that the matrix cannot be rank one). So, apart from the trace $tr(A+A^T)=2tr(A)$, there appears to be essentially no relationship."