Sum of squares of roots of a polynomial $P(x)$

If you have a polynomial $$P(x) = a_nx^n + a_{n-1}x^{n-1}+\cdots + a_1x + a_0$$ and $a_n\neq 0$, $a_0\neq 0$, then consider the "reversal polynomial" $Q(x)$, $$Q(x) = a_0x^n + a_{1}x^{n-1} + \cdots + a_{n-1}x + a_n.$$

Then $r$ is a root of $P(x)$ if and only if $\frac{1}{r}$ is a root of $Q(x)$. Indeed, if $P(r)=0$, then $$\begin{align*} r^nQ\left(\frac{1}{r}\right) &= r^n\left(a_0\left(\frac{1}{r}\right)^n + \cdots + a_{n-1}\left(\frac{1}{r}\right) + a_n\right)\\ &= a_0 + a_1r + \cdots + a_{n-1}r^{n-1}+ a_nr^n\\ &= P(r) = 0. \end{align*}$$ Since $r\neq 0$, it follows that $Q(\frac{1}{r})=0$. Since $P$ is the reversal of $Q$, the symmetric argument establishes the converse implication.
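This reciprocal-root relationship is easy to check numerically. A minimal sketch with NumPy, using a made-up cubic as the example (any polynomial with $a_n\neq 0$ and $a_0\neq 0$ would do):

```python
# Sanity check of the reciprocal-root relationship, using the hypothetical
# example P(x) = 2x^3 - 3x^2 + 4x - 1.
import numpy as np

p = [2, -3, 4, -1]   # coefficients a_n, ..., a_0 (numpy.roots convention)
q = p[::-1]          # reversal polynomial Q: same coefficients, reversed

roots_q = np.roots(q)

# Every root s of Q should satisfy P(1/s) = 0 (up to floating-point error).
for s in roots_q:
    assert abs(np.polyval(p, 1 / s)) < 1e-9
```

Note that reversing the coefficient list is exactly the passage from $P$ to $Q$: `numpy.roots` expects coefficients in order $a_n,\dots,a_0$, so `p[::-1]` lists $a_0,\dots,a_n$.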

So what you are trying to do is essentially equivalent to finding the sums of powers of the roots of a polynomial.

These can be obtained from the coefficients by using Newton's identities, which express the power sums in terms of the elementary symmetric polynomials $e_k$ of the roots; these in turn satisfy $e_k = (-1)^k\frac{a_{n-k}}{a_n}$ by Vieta's formulas.
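For the sum of squares specifically, only the first two Newton identities are needed: $p_1 = e_1$ and $p_2 = e_1p_1 - 2e_2$, so $p_2 = e_1^2 - 2e_2$. A sketch in Python, using a hypothetical cubic as the example:

```python
# Sketch: sum of squares of the roots from the coefficients alone, via
# Newton's identities, for the hypothetical P(x) = 2x^3 - 3x^2 + 4x - 1.
import numpy as np

a = [2, -3, 4, -1]        # coefficients a_n, ..., a_0

# Vieta's formulas: e_k = (-1)^k * a_{n-k} / a_n
e1 = -a[1] / a[0]         # e_1 = sum of the roots
e2 = a[2] / a[0]          # e_2 = sum of products of pairs of roots

# Newton's identities: p_1 = e_1, then p_2 = e_1 * p_1 - 2 * e_2
p2 = e1 * e1 - 2 * e2

# Cross-check against the roots computed directly.
assert np.isclose(p2, sum(r**2 for r in np.roots(a)).real)
```

To get the sum of squares of the *reciprocals* of the roots, run the same computation on the reversed coefficient list `a[::-1]`, i.e. on the reversal polynomial $Q$.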