How are eigenvectors/eigenvalues and differential equations connected?

You could think of eigenvectors as giving you the coordinate system in which your problem takes its simplest form. It's analogous to solving physics problems by rotating your coordinate system the right way so that some awkward term vanishes.


Suppose $\sum a_n y^{(n)} = 0$ is a homogeneous linear differential equation with constant coefficients. Let $D$ be the linear operator defined on, say, the space $C^{\infty}(\mathbb{R})$ of smooth functions $\mathbb{R} \to \mathbb{R}$ which sends a function $y$ to its derivative $y'$. Then the solutions to the above differential equation are the nullspace of the operator $\sum a_n D^n$.

It turns out that this nullspace is finite-dimensional, so we can use finite-dimensional linear algebra on it. It is not just a vector space, though: it comes equipped with an action of $D$, which commutes with $\sum a_n D^n$ and therefore preserves the nullspace; in the simplest case the nullspace splits into one-dimensional eigenspaces of $D$. Now, the eigenvectors of $D$ are precisely the functions satisfying $Dy = \lambda y$ for some eigenvalue $\lambda$, which are precisely the exponential functions $y = e^{\lambda t}$. And it's not hard to see that such a function is a solution of the differential equation if and only if $\lambda$ is a root of the characteristic polynomial $\sum a_n \lambda^n$.
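
A quick symbolic check (a sympy sketch; the degree and the coefficient names $a_0,\dots,a_3$ are just illustrative, not taken from the answer): applying $\sum a_n D^n$ to $e^{\lambda t}$ multiplies it by $\sum a_n \lambda^n$, so $e^{\lambda t}$ lies in the nullspace exactly when $\lambda$ is a root of the characteristic polynomial.

```python
# Sketch: applying the operator sum a_n D^n to y = exp(lam*t) multiplies y by the
# characteristic polynomial sum a_n lam^n. (Degree and coefficient names are illustrative.)
import sympy as sp

t, lam = sp.symbols('t lam')
a = sp.symbols('a0:4')                     # arbitrary constant coefficients a0, ..., a3
y = sp.exp(lam * t)                        # candidate eigenfunction of D = d/dt

Ly = sum(a[n] * sp.diff(y, t, n) for n in range(4))   # (sum a_n D^n) y
char_poly = sum(a[n] * lam**n for n in range(4))      # sum a_n lam^n

print(sp.simplify(Ly - char_poly * y))     # prints 0, i.e. L y = (char. poly) * e^{lam t}
```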

Example. Simple harmonic motion is governed by the differential equation $(m D^2 + k)x = 0$, i.e. $m x'' + k x = 0$, whose characteristic polynomial is $m \lambda^2 + k$. The roots of this polynomial are complex, but if we allow ourselves to work with complex numbers (formally, in the above situation we tensor with $\mathbb{C}$) we find that the eigenvalues are $\lambda = \pm i \sqrt{ \frac{k}{m} }$, so the set of solutions consists of all functions of the form

$$x = A e^{ i \sqrt{ \frac{k}{m} } t } + B e^{-i \sqrt{ \frac{k}{m} } t}.$$

By Euler's formula, if we restrict our solutions to be real we get the familiar periodic combinations of $\cos\left( \sqrt{ \frac{k}{m} }\, t \right)$ and $\sin\left( \sqrt{ \frac{k}{m} }\, t \right)$.
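
If you want a quick numerical sanity check, here is a small numpy sketch (the values of $m$, $k$ and the initial conditions are made up): the roots of $m\lambda^2 + k$ come out as $\pm i\sqrt{k/m}$, and the real solution $x_0 \cos(\omega t) + \frac{v_0}{\omega}\sin(\omega t)$ with $\omega = \sqrt{k/m}$ satisfies $m x'' + k x = 0$ up to discretization error.

```python
# Sketch with made-up m, k and initial conditions: check the characteristic roots and
# that the real sine/cosine combination satisfies m*x'' + k*x = 0 (numerically).
import numpy as np

m, k = 2.0, 8.0
print(np.roots([m, 0.0, k]))               # roots of m*lam^2 + k: approximately +/- 2j
w = np.sqrt(k / m)                         # angular frequency sqrt(k/m) = 2

x0, v0 = 1.0, 0.5                          # illustrative initial position and velocity
t = np.linspace(0.0, 5.0, 2001)
x = x0 * np.cos(w * t) + (v0 / w) * np.sin(w * t)

xpp = np.gradient(np.gradient(x, t), t)    # crude second derivative by finite differences
print(np.max(np.abs(m * xpp + k * x)[2:-2]))   # small residual (discretization error only)
```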

In general $D$ need not act diagonalizably on the nullspace, and then the theory of Jordan normal form applies: a root $\lambda$ of multiplicity $r$ in the characteristic polynomial contributes the solutions $e^{\lambda t}, t e^{\lambda t}, \dots, t^{r-1} e^{\lambda t}$. This occurs, for example, for the critically damped harmonic oscillator, whose characteristic polynomial has a repeated root.
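
Here is a sketch of that repeated-root case (the values $m = 1$, $c = 4$, $k = 4$ are chosen so that $c^2 = 4mk$, i.e. critical damping): the characteristic polynomial $m\lambda^2 + c\lambda + k$ has a double root $\lambda = -c/(2m)$, and $t e^{\lambda t}$ is the extra solution that a plain eigenbasis of $D$ misses.

```python
# Sketch of the Jordan-block (repeated root) case with illustrative values m=1, c=4, k=4:
# the double root lam = -c/(2m) = -2 gives the extra solution t*exp(lam*t).
import sympy as sp

t = sp.symbols('t')
m, c, k = 1, 4, 4                          # c**2 == 4*m*k, so the root is repeated
lam = sp.Rational(-c, 2 * m)               # lam = -2

x = t * sp.exp(lam * t)                    # candidate second solution t*e^{lam t}
residual = m * sp.diff(x, t, 2) + c * sp.diff(x, t) + k * x
print(sp.simplify(residual))               # prints 0, so it solves m*x'' + c*x' + k*x = 0
```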


If you have a linear system in matrix form, $dX/dt=AX$, where $X(t)$ is a vector in $\mathbf{R}^n$ and $A$ is an $n\times n$ constant real matrix, then $X(t) = \exp(\lambda t) V$ is a solution to the system if $V$ is an eigenvector of $A$ with eigenvalue $\lambda$. (This works since $\exp(\lambda t)$ is an eigenfunction of the differential operator $d/dt$ with eigenvalue $\lambda$.)

If $A$ is diagonalizable, then the general solution is a linear combination of such terms: $X(t)=\sum_{k=1}^n c_k \exp(\lambda_k t) V_k$. (If the eigenvalues are complex, this is the general complex-valued solution; one can perform some standard tricks to extract the general real-valued solution from it.) Nondiagonalizable matrices $A$ cause a bit more trouble, but they can be handled too, using generalized eigenvectors and the Jordan form.
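
As a concrete sanity check, here is a small numpy/scipy sketch (the matrix $A$ and the initial value $X_0$ are made up): expanding $X_0$ in the eigenbasis of $A$ and letting each component evolve by $e^{\lambda_k t}$ agrees with the matrix-exponential solution $X(t) = e^{tA} X_0$.

```python
# Sketch with a made-up diagonalizable A and initial value X0: the eigenvector expansion
# X(t) = sum_k c_k exp(lam_k t) V_k (with c = V^{-1} X0) matches expm(A*t) @ X0.
import numpy as np
from scipy.linalg import expm

A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])               # eigenvalues -1 and -2, so A is diagonalizable
X0 = np.array([1.0, 0.0])

lam, V = np.linalg.eig(A)                  # columns V[:, k] are eigenvectors of A
c = np.linalg.solve(V, X0)                 # coefficients of X0 in the eigenbasis

t = 1.7
X_eig = (V * np.exp(lam * t)) @ c          # sum_k c_k exp(lam_k t) V_k
X_ref = expm(A * t) @ X0                   # reference: matrix exponential solution
print(np.allclose(X_eig, X_ref))           # True
```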

(Of course, this is related to what has been said in other answers already, but with a system of this type one sees clearly that the eigenvectors play a role too, not just the eigenvalues.)