Is there a way to determine the eigenvectors of a matrix without working out the eigenvalues?

The power method uses the fact that an arbitrary vector has a decomposition

$$\vec v=v_1\vec e_1+v_2\vec e_2+\cdots+v_n\vec e_n$$ in terms of the eigenvectors, and after $m$ applications of the matrix,

$$A^m\vec v=\lambda_1^mv_1\vec e_1+\lambda_2^mv_2\vec e_2+\cdots+\lambda_n^mv_n\vec e_n.$$

If the eigenvalue of largest magnitude, say $\lambda_1$, is simple and $v_1\ne0$, then for large $m$

$$A^m\vec v\approx\lambda_1^mv_1\vec e_1,$$ so the direction of $A^m\vec v$ tends to that of an eigenvector.

(In practice you can take high powers by successive squarings of $A$, or simply multiply the vector by $A$ repeatedly; either way, rescale the intermediate results to avoid overflow.)
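For concreteness, here is a minimal NumPy sketch of the repeated multiply-and-rescale variant (the function name, iteration count, and tolerance are illustrative choices, not part of any standard library):

```python
import numpy as np

def power_iteration(A, num_iters=1000, tol=1e-12):
    """Repeatedly apply A to a vector and rescale; returns a unit eigenvector estimate."""
    rng = np.random.default_rng(0)
    v = rng.standard_normal(A.shape[0])   # arbitrary starting vector
    v /= np.linalg.norm(v)
    for _ in range(num_iters):
        w = A @ v                          # one application of A
        w /= np.linalg.norm(w)             # rescale to avoid overflow
        # stop when the direction is stable (up to a sign flip, which
        # occurs when the dominant eigenvalue is negative)
        if min(np.linalg.norm(w - v), np.linalg.norm(w + v)) < tol:
            return w
        v = w
    return v
```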


The method does not give you the eigenvalue directly, but you can compute

$$\vec u_1=\frac{A^m\vec v}{\|A^m\vec v\|}$$ and then the Rayleigh quotient $$\lambda_1=\vec u_1^TA\vec u_1.$$
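Continuing the sketch above, the eigenvalue is recovered as the Rayleigh quotient of the converged unit vector (the matrix here is a hypothetical example):

```python
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
u1 = power_iteration(A)
lambda1 = u1 @ A @ u1        # u1 has unit norm, so this is u1^T A u1
print(lambda1)               # ~3.618 = (5 + sqrt(5))/2, the dominant eigenvalue
```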


Obtaining the next eigenvectors is possible, provided that you remove from $\vec v$ the contributions of the preceding eigenvectors by projection (as in a Gram-Schmidt process). You have to repeat this periodically during the iteration, because numerical errors will make the components along the eigenvectors of larger eigenvalues resurface; see the sketch below.
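A sketch of this deflation step, assuming $A$ is symmetric so that its eigenvectors are orthogonal (for a non-symmetric matrix the projection would need the left eigenvectors instead); `found` holds the unit eigenvectors already computed:

```python
def next_eigenvector(A, found, num_iters=1000, reproject_every=5):
    rng = np.random.default_rng(1)
    v = rng.standard_normal(A.shape[0])
    for k in range(num_iters):
        if k % reproject_every == 0:
            for e in found:               # Gram-Schmidt-style projection:
                v -= (e @ v) * e          # remove the known components,
        v = A @ v                         # which rounding errors re-introduce
        v /= np.linalg.norm(v)            # rescale to avoid overflow
    for e in found:                       # one final clean-up projection
        v -= (e @ v) * e
    return v / np.linalg.norm(v)

u2 = next_eigenvector(A, [u1])
lambda2 = u2 @ A @ u2                     # ~1.382, the second eigenvalue
```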


Most iterative numerical methods produce approximations to the eigenvectors and the eigenvalues at the same time.