How to prove that $\det(A)$ can be expressed as an $n \times n$ determinant with entries $\operatorname{tr}(A^k)$

The Newton identities relating the power sums of the eigenvalues, $s_k=\operatorname{tr}(A^k)$, to the coefficients of the characteristic polynomial $\chi_A(t)=t^n+c_1t^{n-1}+\dots+c_{n-1}t+c_n$, where $c_n=(-1)^n\det(A)$, read

\begin{align} s_1 &= -c_1,\\ s_2 &= -c_1 s_1 - 2 c_2,\\ s_3 &= -c_1 s_2 - c_2 s_1 - 3 c_3,\\ s_4 &= -c_1 s_3 - c_2 s_2 - c_3 s_1 - 4 c_4, \\ & {} \ \ \vdots\\ s_n &= -c_1 s_{n-1}-\dots-c_{n-1} s_1 - n c_n \end{align}

which can be written as a matrix-vector system

$$-\begin{bmatrix} s_1\\s_2\\s_3\\\vdots\\s_{n-1}\\s_n \end{bmatrix} = \begin{bmatrix} 1&0&0&\dots&0&0\\ s_1&2&0&\dots&0&0\\ s_2&s_1&3&&0&0\\ \vdots&\vdots&&&&\vdots\\ s_{n-2}&s_{n-3}&s_{n-4}&\dots&n-1&0\\ s_{n-1}&s_{n-2}&s_{n-3}&\dots&s_1&n \end{bmatrix} \begin{bmatrix} c_1\\c_2\\c_3\\\vdots\\c_{n-1}\\c_n \end{bmatrix}$$

or

$$\begin{bmatrix} 1\\0\\0\\0\\\vdots\\0\\0 \end{bmatrix} = \begin{bmatrix} 1&0&0&0&\dots&0&0\\ s_1&1&0&0&\dots&0&0\\ s_2&s_1&2&0&\dots&0&0\\ s_3&s_2&s_1&3&&0&0\\ \vdots&\vdots&\vdots&&&&\vdots\\ s_{n-1}&s_{n-2}&s_{n-3}&s_{n-4}&\dots&n-1&0\\ s_n&s_{n-1}&s_{n-2}&s_{n-3}&\dots&s_1&n \end{bmatrix} \begin{bmatrix} 1\\c_1\\c_2\\c_3\\\vdots\\c_{n-1}\\c_n \end{bmatrix}$$
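As a quick numerical sanity check (a minimal sketch using NumPy; the random test matrix and its size are arbitrary choices), one can assemble this triangular system from the power sums $s_k=\operatorname{tr}(A^k)$, solve it for the $c_k$, and compare the result with the characteristic polynomial and with $(-1)^n\det(A)$:

```python
import numpy as np

def coeffs_from_traces(A):
    """Solve the triangular Newton system for c_1, ..., c_n given s_k = tr(A^k)."""
    n = A.shape[0]
    s = [np.trace(np.linalg.matrix_power(A, k)) for k in range(1, n + 1)]
    # Row k of the system:  s_{k-1} c_1 + ... + s_1 c_{k-1} + k c_k = -s_k
    M = np.zeros((n, n))
    for k in range(1, n + 1):
        for j in range(1, k):
            M[k - 1, j - 1] = s[k - j - 1]   # entry s_{k-j}
        M[k - 1, k - 1] = k
    return np.linalg.solve(M, -np.array(s))  # returns [c_1, ..., c_n]

A = np.random.rand(5, 5)
c = coeffs_from_traces(A)
print(np.allclose(c, np.poly(A)[1:]))                   # matches the characteristic polynomial
print(np.isclose(c[-1], (-1) ** 5 * np.linalg.det(A)))  # c_n = (-1)^n det(A)
```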

Now apply Cramer's rule to compute $c_n$: the system matrix is lower triangular with determinant $1\cdot 1\cdot 2\cdots n=n!$, and the numerator is obtained by replacing its last column with the right-hand side and expanding along that column. Together with $c_n=(-1)^n\det(A)$, this gives the stated formula.
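Written out, this Cramer step gives the explicit determinant formula (with $s_k=\operatorname{tr}(A^k)$)

$$\det(A) = \frac{1}{n!}\,\det\begin{bmatrix} s_1 & 1 & 0 & \dots & 0\\ s_2 & s_1 & 2 & \dots & 0\\ \vdots & \vdots & & \ddots & \vdots\\ s_{n-1} & s_{n-2} & \dots & s_1 & n-1\\ s_n & s_{n-1} & \dots & s_2 & s_1 \end{bmatrix}$$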


Note that solving this triangular system constitutes the computational core of the Faddeev–LeVerrier algorithm for the (almost) division-free computation of the characteristic polynomial of a matrix; the only divisions that occur are by the integers $1,\dots,n$.
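For illustration, here is a minimal sketch of the Faddeev–LeVerrier recursion in NumPy (the $4\times4$ random test matrix is an arbitrary choice); it returns the coefficients $[1,c_1,\dots,c_n]$ of $\chi_A$, and in particular $c_n=(-1)^n\det(A)$:

```python
import numpy as np

def faddeev_leverrier(A):
    """Coefficients [1, c_1, ..., c_n] of chi_A(t) = t^n + c_1 t^{n-1} + ... + c_n."""
    n = A.shape[0]
    M = np.eye(n)                  # M_1 = I
    c = -np.trace(A)               # c_1 = -tr(A)
    coeffs = [1.0, c]
    for k in range(2, n + 1):
        M = A @ M + c * np.eye(n)  # M_k = A M_{k-1} + c_{k-1} I
        c = -np.trace(A @ M) / k   # c_k = -tr(A M_k) / k  (division by an integer only)
        coeffs.append(c)
    return np.array(coeffs)

A = np.random.rand(4, 4)
print(np.allclose(faddeev_leverrier(A), np.poly(A)))           # characteristic polynomial
print(np.isclose(faddeev_leverrier(A)[-1], np.linalg.det(A)))  # c_4 = (-1)^4 det(A) = det(A)
```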


Determinants and traces are invariant under similarity, and both sides of the identity are polynomial in the entries of $A$; since diagonalizable matrices are dense and every diagonalizable matrix is similar to a diagonal one, it is enough to check the identity for diagonal matrices.
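For example, for $n=2$ the identity reads $\det(A)=\tfrac12\big(s_1^2-s_2\big)$, and for $A=\operatorname{diag}(a,b)$ one checks directly

$$\tfrac12\big(s_1^2 - s_2\big) = \tfrac12\big((a+b)^2 - (a^2+b^2)\big) = ab = \det(A).$$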