Why do complex eigenvalues correspond to a rotation of the vector?

Suppose for a moment that we have just the real vector space $V = \mathbb{R}^2$. Here is an example to see why a geometric rotation will correspond to a complex eigenvalue.

First, for a concrete example, say $M$ is a rotation by $90$ degrees. If we only consider real numbers, we can easily check that $M$ has no real eigenvalues. But suppose we want to "pretend" $M$ had some eigenvalue, $\lambda$. We don't know yet that $\lambda$ is complex, just that it can't be real.

We know there is some (nonzero) vector $x$ with $Mx = \lambda x$. Then $M^2x = \lambda^2 x$. But $M^2$ is a rotation by $180$ degrees, so $M^2 x = -x$. Hence $\lambda^2 = -1$. So $\lambda$ is a square root of $-1$ -- this immediately suggests we should look at complex numbers for $\lambda$, and in particular $\lambda$ should be $i$ or $-i$.
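As a quick numerical sanity check, here is a short numpy sketch (using the standard matrix for rotation by $90$ degrees) confirming that $M^2 = -I$ and that the eigenvalues are exactly $\pm i$:

```python
import numpy as np

# Rotation by 90 degrees: (x, y) -> (-y, x)
M = np.array([[0.0, -1.0],
              [1.0,  0.0]])

# M has no real eigenvalues, but over C its eigenvalues are +i and -i.
eigenvalues = np.linalg.eigvals(M)
assert np.allclose(sorted(eigenvalues, key=lambda z: z.imag), [-1j, 1j])

# M^2 is rotation by 180 degrees, i.e. -I, so each eigenvalue squares to -1.
assert np.allclose(M @ M, -np.eye(2))
assert all(np.isclose(z**2, -1) for z in eigenvalues)
```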

Similarly, say $N$ is a rotation by $60$ degrees, which also has no real eigenvalues. Then we will have $N^3 = -I$, and so any eigenvalue of $N$ should be a non-real cube root of $-1$.
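The same check works numerically for the $60$-degree case (a sketch with numpy, building $N$ from the usual rotation-matrix formula):

```python
import numpy as np

theta = np.pi / 3  # 60 degrees
N = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# Three 60-degree turns make a 180-degree turn: N^3 = -I.
assert np.allclose(np.linalg.matrix_power(N, 3), -np.eye(2))

# Hence each eigenvalue is a non-real cube root of -1, namely e^{±iπ/3}.
for lam in np.linalg.eigvals(N):
    assert np.isclose(lam**3, -1)
    assert not np.isclose(lam.imag, 0)
```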

This is why rotations correspond with complex eigenvalues, in general: because iterated rotations go "around a circle" in the same way as iterated powers of complex numbers on the unit circle.


First of all, eigenvalues and eigenvectors exist only for endomorphisms of a vector space, so you need $T:\mathbb{R}^n\to\mathbb{R}^n$ (same dimension).

Now, $T$ is given by multiplication by a matrix $A$ with real entries; you can extend $T$ to a map $T':\mathbb{C}^n\to\mathbb{C}^n$ given by the same matrix: $T'(z)=Az$. The real vector space you started from can be found as $$V=\{\mathrm{Im}\; z_1=\ldots=\mathrm{Im}\; z_n=0\}$$ and it is preserved by $T'$, as $A$ has real coefficients.

Suppose $T$ has a complex eigenvalue $\lambda=a+ib$. It follows that $T'$ has the same eigenvalue; moreover, as $A$ has real coefficients, $\bar\lambda=a-ib$ is also an eigenvalue for $T$ and $T'$. Let $w\in\mathbb{C}^n$ be a (complex) eigenvector for $\lambda$, i.e. $$T'(w)=Aw=\lambda w$$ then $$T'(\bar{w})=A\bar{w}=\overline{Aw}=\overline{\lambda w}=\bar\lambda\bar{w}$$ i.e. $\bar{w}$ is an eigenvector for $\bar{\lambda}$. Consider $W=\mathrm{Span}\{w,\bar{w}\}\subseteq\mathbb{C}^n$; this is an invariant subspace for $T'$ (the sum of two eigenspaces), so $W\cap V$ is invariant under $T$.
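The conjugate-pair fact is easy to witness numerically (a sketch; the matrix `A` below is just a hypothetical example with real entries and non-real eigenvalues):

```python
import numpy as np

# A hypothetical real matrix with non-real eigenvalues ±2i.
A = np.array([[0.0, -2.0],
              [2.0,  0.0]])

lams, vecs = np.linalg.eig(A)
lam, w = lams[0], vecs[:, 0]

# w is an eigenvector for lam ...
assert np.allclose(A @ w, lam * w)
# ... and, because A is real, conj(w) is an eigenvector for conj(lam).
assert np.allclose(A @ np.conj(w), np.conj(lam) * np.conj(w))
```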

It is easy to believe that $W\cap V$ consists of the linear combinations of $w$ and $\bar{w}$ (with complex coefficients) which end up being real vectors. Moreover, $\dim(W\cap V)=2$ as a real vector space: $W$ is $4$-dimensional over the reals, $V\cap W\neq W$ because $w, \bar{w}\not\in V$ (and they are linearly independent over the reals), and $V\cap W$ contains at least $w+\bar{w}$ and $i(w-\bar{w})$, which are linearly independent (over the real or complex numbers).

Hence $U=W\cap V$ is generated by $v_1=w+\bar{w}$ and $v_2=i(w-\bar{w})$, which, even though one of them contains an $i$, are both real vectors.

Compute $$T(v_1)=T'(w+\bar{w})=T'(w)+T'(\bar{w})=\lambda w+\overline{\lambda w}$$ $$T(v_2)=i(\lambda w-\overline{\lambda w})$$

If you write $\lambda=\rho(\cos\theta+i\sin\theta)$, then $$\lambda w+\overline{\lambda w}=\rho(\cos\theta w + i \sin\theta w +\cos\theta\bar{w}-i\sin\theta\bar{w})=\rho\cos\theta (w+\bar{w}) + i\rho\sin\theta(w-\bar{w})$$ $$=\rho \cos\theta v_1 + \rho\sin\theta v_2$$ Similarly you can compute $$T(v_2)=-\rho\sin\theta v_1+\rho\cos\theta v_2$$

So, $T$ restricted to $U$, with respect to the basis $\{v_1, v_2\}$, is given by the matrix $$\rho\begin{pmatrix}\cos\theta&-\sin\theta\\\sin\theta&\cos\theta\end{pmatrix}$$ which is obtained by composing a homothety of ratio $\rho$ with a rotation by the angle $\theta$.
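The whole construction can be sketched in a few lines of numpy (the matrix `A` below is a hypothetical example with eigenvalues $1 \pm i\sqrt{2}$): build $v_1, v_2$ from a complex eigenvector, change basis, and check that the restricted matrix is exactly $\rho$ times a rotation by $\theta$.

```python
import numpy as np

# Hypothetical real 2x2 matrix with non-real eigenvalues 1 ± i*sqrt(2).
A = np.array([[1.0, -2.0],
              [1.0,  1.0]])

lams, vecs = np.linalg.eig(A)
lam, w = lams[0], vecs[:, 0]

# Real basis of U = W ∩ V as in the text: v1 = w + conj(w), v2 = i(w - conj(w)).
v1 = np.real(w + np.conj(w))
v2 = np.real(1j * (w - np.conj(w)))

# Matrix of T restricted to U in the basis {v1, v2}: B = P^{-1} A P.
P = np.column_stack([v1, v2])
B = np.linalg.solve(P, A @ P)

# It should be a homothety of ratio rho = |lam| composed with
# a rotation by theta = arg(lam).
rho, theta = abs(lam), np.angle(lam)
R = rho * np.array([[np.cos(theta), -np.sin(theta)],
                    [np.sin(theta),  np.cos(theta)]])
assert np.allclose(B, R)
```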


By the way, the matrix $$A=\begin{pmatrix}1&1\\0&1\end{pmatrix}$$ does not act as a composition of dilations and rotations. The only matrices which behave like that are the ones of the form $$\begin{pmatrix}a & -b\\b& a\end{pmatrix}$$ which constitute a subring of the $2\times 2$ matrices that is isomorphic to the complex numbers and so, indeed, a field.
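One can check this ring isomorphism numerically (a small sketch; the helper name `m` is just for illustration):

```python
import numpy as np

def m(z):
    """The 2x2 real matrix [[a, -b], [b, a]] encoding z = a + bi."""
    a, b = z.real, z.imag
    return np.array([[a, -b], [b, a]])

z1, z2 = 2 + 3j, -1 + 4j

# Matrix addition and multiplication mirror complex arithmetic exactly.
assert np.allclose(m(z1) + m(z2), m(z1 + z2))
assert np.allclose(m(z1) @ m(z2), m(z1 * z2))
# Every nonzero element is invertible, as in a field.
assert np.allclose(np.linalg.inv(m(z1)), m(1 / z1))
```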


There is one interpretation: let $a+bi$ be a complex eigenvalue corresponding to the eigenvector $v=v_1+iv_2$, where $v_1,v_2$ are real vectors. It can be proven that $a-bi$ is also an eigenvalue, corresponding to the vector $w=\overline{v}=v_1-iv_2$, so by definition:

$$Tv=(a+bi)v$$

$$Tw=(a-bi)w$$

In expanded form:

$$Tv_1+iTv_2=(a+bi)(v_1+iv_2)=av_1-bv_2+i(av_2+bv_1)$$

$$Tv_1-iTv_2=(a-bi)(v_1-iv_2)=av_1-bv_2-i(av_2+bv_1)$$

If you add these equations side by side you get:

$$Tv_1=av_1-bv_2$$

If you instead multiply the second equation by $-1$ before adding, you get:

$$Tv_2=bv_1+av_2$$
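These two formulas are easy to verify numerically (a sketch with numpy; the matrix `T` below is a hypothetical example with eigenvalues $1 \pm i$):

```python
import numpy as np

# Hypothetical real matrix with eigenvalues 1 ± i.
T = np.array([[3.0, -5.0],
              [1.0, -1.0]])

lams, vecs = np.linalg.eig(T)
lam, v = lams[0], vecs[:, 0]
a, b = lam.real, lam.imag
v1, v2 = v.real, v.imag      # v = v1 + i*v2 with v1, v2 real

# Real and imaginary parts of T v = (a + bi) v give the two formulas.
assert np.allclose(T @ v1, a * v1 - b * v2)
assert np.allclose(T @ v2, b * v1 + a * v2)
```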

You can see the analogy to the rotation matrix:

$$\frac{1}{\sqrt{a^2+b^2}}\begin{bmatrix}a & -b \\ b & a\end{bmatrix}$$