Eigenvectors and eigenvalues of a tridiagonal Toeplitz matrix

Here is the calculation of the spectrum of the first matrix, which I write $pJ+qK$ with $K=J^T$. Define $D={\rm diag}(1,a,a^2,\ldots,a^{n-1})$. Then $D^{-1}JD=a^{-1}J$ and $D^{-1}KD=aK$. Thus, taking $a=\sqrt{p/q}$, one sees that your matrix is similar to $\sqrt{pq}(J+K)$. Its eigenvalues are $\sqrt{pq}$ times those of $J+K$. The spectrum of the latter matrix consists of the numbers $2\cos\frac{k\pi}{n+1}$ for $k=1,\ldots,n$.
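
For a quick sanity check, here is a small NumPy sketch (with sample values of $n$, $p$, $q$ of my own choosing, not from the question) comparing the spectrum of $pJ+qK$ with $2\sqrt{pq}\cos\frac{k\pi}{n+1}$:

```python
import numpy as np

# Sample parameters (any n and positive p, q will do).
n, p, q = 6, 2.0, 3.0

# pJ + qK: p on one off-diagonal, q on the other, zero diagonal.
A = p * np.diag(np.ones(n - 1), -1) + q * np.diag(np.ones(n - 1), 1)

computed = np.sort(np.linalg.eigvals(A).real)
predicted = np.sort([2 * np.sqrt(p * q) * np.cos(k * np.pi / (n + 1))
                     for k in range(1, n + 1)])

print(np.allclose(computed, predicted))  # True
```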

The second case is easy too. The entries of an eigenvector form an $n$-periodic solution of the recursion $qu_{j+1}+pu_{j-1}=\lambda u_j$; trying $u_j=r^j$, periodicity forces $r^n=1$. This means that some power of $\omega=\exp\frac{2i\pi}n$ is a root of the characteristic equation $qr^2+p=\lambda r$. Whence the spectrum $\lambda_1,\ldots,\lambda_n$, given by $$\lambda_j=p\omega^{-j}+q\omega^j.$$
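
Here too a small numerical sketch (NumPy, arbitrary sample $n$, $p$, $q$) confirms that each $p\omega^{-j}+q\omega^{j}$ shows up in the spectrum of the periodic matrix:

```python
import numpy as np

n, p, q = 7, 2.0, 3.0

# Periodic tridiagonal matrix: q above the diagonal, p below, wrapped around.
A = q * np.diag(np.ones(n - 1), 1) + p * np.diag(np.ones(n - 1), -1)
A[n - 1, 0] = q   # wrap-around: u_{n+1} = u_1
A[0, n - 1] = p   # wrap-around: u_0 = u_n

w = np.exp(2j * np.pi / n)
predicted = np.array([p * w**(-j) + q * w**j for j in range(n)])
computed = np.linalg.eigvals(A)

# Every predicted value appears among the computed eigenvalues.
print(all(np.min(np.abs(computed - lam)) < 1e-8 for lam in predicted))  # True
```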


Another way to look at this problem, from the ground up, is to expand the characteristic polynomial of your first matrix, $\mathcal{T}_n(p,q)$, along the last row and then along the last column, from which you'll quickly get $$\det(\mathcal{T}_n(p,q)-\lambda)=-\lambda \det(\mathcal{T}_{n-1}(p,q)-\lambda)-pq \det(\mathcal{T}_{n-2}(p,q)-\lambda).$$ You can then compare this with the recurrence relations for the standard orthogonal polynomials; it is rather easy to match it to that for the Chebyshev polynomials of the second kind, $$U_{n+1}(x)=2x U_n(x)-U_{n-1}(x),$$ with $U_0(x)=1$, $U_1(x)=2x$, which should make it clear that the identification is $$U_n(\lambda)=\frac{1}{(\sqrt{pq})^n}\det\left(\mathcal{T}_n(p,q)+2\sqrt{pq}\,\lambda\right),$$ or the equivalent $\det(\mathcal{T}_n(p,q)-\lambda)=(\sqrt{pq})^n U_n\left(\frac{-\lambda}{2\sqrt{pq}}\right)$. Since the Chebyshev polynomials of the second kind are given by $U_n(\cos\theta)=\frac{\sin((n+1)\theta)}{\sin\theta}$, whose roots are $\cos\frac{k\pi}{n+1}$, this gives all the eigenvalues. The reason this happens is that (Denis' symmetric version of) your matrix is the Jacobi matrix for the Chebyshev polynomials of the second kind. You can exploit this to get the eigenvectors in terms of lower order polynomials $U_m$, $m<n$, evaluated at the eigenvalue; the construction is in

Gautschi, Walter. Orthogonal Polynomials: Computation and Approximation. Numerical Mathematics and Scientific Computation, Oxford University Press, 2004.
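
If it helps, here is a short numerical sketch (sample $n$, $p$, $q$; the helper `cheb_U` is just the three-term recurrence quoted above) that checks the determinant identity at an arbitrary test point:

```python
import numpy as np

def cheb_U(m, x):
    """Chebyshev polynomial of the second kind U_m(x) via the three-term recurrence."""
    u_prev, u = 1.0, 2.0 * x          # U_0, U_1
    if m == 0:
        return u_prev
    for _ in range(m - 1):
        u_prev, u = u, 2.0 * x * u - u_prev
    return u

n, p, q = 5, 2.0, 3.0
Tn = p * np.diag(np.ones(n - 1), -1) + q * np.diag(np.ones(n - 1), 1)

lam = 0.7   # arbitrary test point
lhs = np.linalg.det(Tn - lam * np.eye(n))
rhs = np.sqrt(p * q) ** n * cheb_U(n, -lam / (2 * np.sqrt(p * q)))
print(np.isclose(lhs, rhs))  # True
```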


If I have read your question correctly, the second matrix is a so-called circulant matrix, and so one can read off the spectrum using known methods. Wikipedia gives you a formula that can be used.

That said, I prefer to approach things from scratch. A circulant $n\times n$ matrix can always be written as $f(S)$ where $f$ is a polynomial of degree $\leq n-1$ and $S$ is a cyclic shift matrix of order $n$. If the first column of the matrix reads $a_0, \dots, a_{n-1}$, then take $S$ to be the shift matrix $e_1\mapsto e_2 \mapsto \dots \mapsto e_n \mapsto e_1$, and take $f(z)=a_0+a_1z+ \dots + a_{n-1}z^{n-1}$.
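
As a concrete check of this description (a sketch assuming NumPy/SciPy and an arbitrary coefficient vector of my own choosing), one can build $f(S)$ from powers of the shift and compare it with SciPy's circulant constructor:

```python
import numpy as np
from scipy.linalg import circulant

n = 5
a = np.array([3.0, 1.0, 4.0, 1.0, 5.0])   # a_0, ..., a_{n-1}

# Cyclic shift S: e_1 -> e_2 -> ... -> e_n -> e_1.
S = np.roll(np.eye(n), 1, axis=0)

# f(S) = a_0 I + a_1 S + ... + a_{n-1} S^{n-1}
fS = sum(a[k] * np.linalg.matrix_power(S, k) for k in range(n))

print(np.allclose(fS, circulant(a)))  # True: f(S) is the circulant with first column a
```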

So in your case, the matrix is just $A=pS+ qS^{n-1} = pS + qS^{-1}$, and since we know the eigenvalues of $S$ (they are the $n$ distinct complex $n$th roots of unity) and corresponding one-dimensional eigenspaces, this allows us to write down the eigenvalues of $A$

$$\{ p\omega^j + q\omega^{-j} : j=0,1,\dots, n-1\} \quad\quad(\omega=\exp(2\pi i/n)) $$

with corresponding eigenvectors inherited from $S$: the eigenvalue $p\omega^j+q\omega^{-j}$ has eigenvector $(1,\omega^{-j},\omega^{-2j},\dots,\omega^{-(n-1)j})^T$. (Depending on the values of $p$ and $q$, some of the eigenvalues may have non-trivial multiplicity.)
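
To make this explicit, here is a short sketch (sample $n$, $p$, $q$) verifying $Av=\lambda v$ for each of these eigenpairs:

```python
import numpy as np

n, p, q = 6, 2.0, 3.0
S = np.roll(np.eye(n), 1, axis=0)   # cyclic shift e_1 -> e_2 -> ... -> e_n -> e_1
A = p * S + q * S.T                 # S^{-1} = S^T for a permutation matrix

w = np.exp(2j * np.pi / n)
for j in range(n):
    v = w ** (-j * np.arange(n))    # k-th entry omega^{-jk}
    lam = p * w**j + q * w**(-j)
    assert np.allclose(A @ v, lam * v)
print("all eigenpairs verified")
```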