What is the importance of the spectral theorem?

Write down a square matrix, $A$. Now, raise it to the power 100. Not so easy, is it? Well, it is if the matrix is diagonal. It's also easy if the matrix is diagonalizable; if $P^{-1}AP=D$ is diagonal, then $A^{100}=PD^{100}P^{-1}$. So, computing high powers of matrices is made easy by diagonalization.

And why would you want to compute high powers of a matrix? Well, many things are modelled by discrete linear dynamical systems, which is a fancy way of saying you have a sequence of vectors $v_0,v_1,v_2,\dots$ where you get each vector (after the first) by multiplying the previous vector by $A$. But then $v_k=A^kv_0$, and voila! there's your high power of a matrix.
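The two ideas above can be checked numerically. Below is a minimal NumPy sketch (the matrix $A$ and initial vector $v_0$ are chosen purely for illustration): we diagonalize $A$, raise only the eigenvalues to the 100th power, and recover $v_{100}=A^{100}v_0$.

```python
import numpy as np

# A small diagonalizable matrix -- illustrative data only.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Eigendecomposition: the columns of P are eigenvectors, so
# A = P @ np.diag(eigvals) @ inv(P), i.e. P^{-1} A P is diagonal.
eigvals, P = np.linalg.eig(A)

# A^100 via diagonalization: only the (scalar) eigenvalues get powered.
A100 = P @ np.diag(eigvals**100) @ np.linalg.inv(P)

# Sanity check against direct repeated multiplication.
assert np.allclose(A100, np.linalg.matrix_power(A, 100))

# One step of the discrete dynamical system story: v_k = A^k v_0.
v0 = np.array([1.0, 0.0])
v100 = A100 @ v0
```

For this particular $A$ the eigenvalues are $3$ and $1$, so $A^{100}$ has entries of size about $3^{100}/2$, which the eigenvalue route produces with two matrix multiplications instead of a hundred.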


Thanks to the spectral theorem you can easily compute the powers of a diagonalizable matrix $A = PDP^{-1}$ (as @Gerry Myerson said), via the relation $$A^k = PD^kP^{-1}.$$ But you can also exploit the diagonalization to evaluate more complicated matrix functions, e.g. the matrix exponential. Indeed, $$e^A = \sum_{k=0}^\infty \frac{A^k}{k!} = P\left(\sum_{k=0}^\infty \frac{D^k}{k!}\right)P^{-1} = Pe^DP^{-1},$$ and for diagonal matrices $D = \text{diag}(d_1,\ldots,d_n)$ we have the simple relation $$e^D = \text{diag}(e^{d_1},\ldots,e^{d_n}).$$ This holds in general for any matrix function defined by a power series: if $f(x)=\sum_{k=0}^\infty c_k x^k$, then $$f(A) = P\,f(D)\,P^{-1},$$ where $f(D) = \text{diag}(f(d_1),\ldots,f(d_n))$.

[EDIT: notice that the matrix exponential is extremely important: for instance, it lets you solve linear systems of differential equations with constant coefficients, $\dot{\mathbf{x}}(t) = A \mathbf{x}(t) + \mathbf{b}(t)$. In the homogeneous case $\mathbf{b}=\mathbf{0}$ the solution is simply $\mathbf{x}(t) = e^{tA}\mathbf{x}(0)$.]
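As a concrete illustration of this use of the exponential: for a constant matrix $A$, the homogeneous system $\dot{\mathbf{x}} = A\mathbf{x}$ has solution $\mathbf{x}(t) = e^{tA}\mathbf{x}(0)$, and $e^{tA}$ can again be computed through the eigendecomposition. A NumPy sketch, with the matrix chosen purely for illustration (it is diagonalizable over $\mathbb{C}$, with eigenvalues $\pm i$):

```python
import numpy as np

# Solve x'(t) = A x(t), x(0) = x0, for a constant matrix A,
# using the closed form x(t) = e^{tA} x0.
A = np.array([[0.0, 1.0],
              [-1.0, 0.0]])   # generator of rotations; eigenvalues +-i

eigvals, P = np.linalg.eig(A)   # diagonalizable over the complex numbers
Pinv = np.linalg.inv(P)

def x(t, x0):
    """Evaluate x(t) = e^{tA} x0 via the eigendecomposition of A."""
    exp_tD = np.diag(np.exp(t * eigvals))
    return (P @ exp_tD @ Pinv @ x0).real  # solution is real for real A, x0

x0 = np.array([1.0, 0.0])
# For this A the exact solution is x(t) = (cos t, -sin t).
```

Even though the eigendecomposition is complex, the combination $Pe^{tD}P^{-1}$ is real, which is why taking the real part at the end is harmless.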