Let $T\colon \mathbb{R}^2 \to \mathbb{R}^2$ be a linear transformation. Show that there are $a,b\in \mathbb{R}$ such that $T^2+aT+bI=0$.

Pick any nonzero vector $v$. Then either $Tv$ is linearly independent of $v$, or $Tv=pv$ for some scalar $p$.

If $Tv$ is linearly independent of $v$, then $\{Tv,v\}$ is a basis of $\mathbb R^2$. Therefore $T^2v$ is a linear combination of $Tv$ and $v$; that is, $T^2v+aTv+bv=0$ for some scalars $a$ and $b$. It follows that both $(T^2+aT+bI)v$ and $(T^2+aT+bI)(Tv)=T\left((T^2+aT+bI)v\right)$ are zero. That is, $T^2+aT+bI$ maps a basis of $\mathbb R^2$ to zero. Hence $T^2+aT+bI$ must be zero.
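
For concreteness, here is a small SymPy sketch of this case (my own illustration; the particular matrix and the vector $v$ are arbitrary choices, not part of the argument). With $T$ a rotation by $90^\circ$ and $v=(1,0)^T$, the vectors $v$ and $Tv$ are independent, and the scalars $a,b$ obtained from $T^2v+aTv+bv=0$ annihilate the whole plane:

```python
import sympy as sp

# Illustrative choices (not part of the proof): T is rotation by 90 degrees,
# v = (1, 0)^T, so v and Tv are linearly independent.
T = sp.Matrix([[0, -1],
               [1,  0]])
v = sp.Matrix([1, 0])

Tv = T * v
T2v = T * Tv

# Solve T^2 v + a*T v + b*v = 0 for the scalars a and b.
a, b = sp.symbols('a b')
sol = sp.solve(list(T2v + a * Tv + b * v), [a, b])
a0, b0 = sol[a], sol[b]

# The same a, b annihilate both basis vectors, hence the whole plane.
print(a0, b0)                              # here a = 0, b = 1
print(T**2 + a0 * T + b0 * sp.eye(2))      # Matrix([[0, 0], [0, 0]])
```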

If $Tv=pv$, then $(T-pI)v=0$. Let $u$ be any vector that is linearly independent of $v$. Then $\{u,v\}$ is a basis of $\mathbb R^2$. Hence $Tu$ is a linear combination of $u$ and $v$; that is, $Tu=qu+rv$ for some scalars $q$ and $r$. It follows that both $(T-pI)(T-qI)u=(T-pI)(rv)=r(T-pI)v$ and $(T-pI)(T-qI)v=(T-qI)\left((T-pI)v\right)$ are zero. That is, $(T-pI)(T-qI)$ maps a basis of $\mathbb R^2$ to zero. Hence $(T-pI)(T-qI)$ must be zero. Expanding the product, we get $T^2+aT+bI=0$ with $a=-(p+q)$ and $b=pq$.
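
Again purely as an illustration (the triangular matrix below is an assumed example, not taken from the answer), the same bookkeeping can be watched in the eigenvector case:

```python
import sympy as sp

# Illustrative choices: an upper-triangular T, so v = (1, 0)^T is an eigenvector
# with T v = 2 v, i.e. p = 2; u = (0, 1)^T is independent of v.
T = sp.Matrix([[2, 3],
               [0, 5]])
v = sp.Matrix([1, 0])
u = sp.Matrix([0, 1])
p = 2

# Write T u = q u + r v and read off q, r from the coordinates of T u.
Tu = T * u
q, r = Tu[1], Tu[0]          # here q = 5, r = 3

# (T - pI)(T - qI) sends both u and v to zero, hence is the zero operator.
print((T - p * sp.eye(2)) * (T - q * sp.eye(2)))     # Matrix([[0, 0], [0, 0]])

# Expanded: T^2 + aT + bI = 0 with a = -(p + q), b = p*q.
print(T**2 - (p + q) * T + p * q * sp.eye(2))        # Matrix([[0, 0], [0, 0]])
```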

Edit. If you are comfortable with the concept of the minimal polynomial, a better proof is to show that the minimal polynomial of a linear operator $T$ on an $n$-dimensional vector space has degree at most $n$. See this answer for instance.


Let $f(\lambda) = \det(T-\lambda I)$ be the characteristic polynomial of $T$. By the Cayley–Hamilton theorem, $f(T)=0$. Since $f$ is a monic polynomial of degree $2$, this is exactly a relation of the form $T^2+aT+bI=0$.
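
If you want to see this identity checked symbolically, here is a short SymPy sketch for a generic $2\times 2$ matrix (a sanity check of the claim, not a substitute for the theorem):

```python
import sympy as sp

# Generic 2x2 matrix with symbolic entries.
t11, t12, t21, t22 = sp.symbols('t11 t12 t21 t22')
T = sp.Matrix([[t11, t12],
               [t21, t22]])

lam = sp.Symbol('lambda')
f = (T - lam * sp.eye(2)).det()              # f(lambda) = det(T - lambda*I)
c2, c1, c0 = sp.Poly(f, lam).all_coeffs()    # f(lambda) = c2*lambda^2 + c1*lambda + c0

# Plug the matrix T into f: f(T) = c2*T^2 + c1*T + c0*I.
fT = c2 * T**2 + c1 * T + c0 * sp.eye(2)
print(sp.simplify(fT))                       # Matrix([[0, 0], [0, 0]]), so a = c1, b = c0 work
```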


With a "wink-wink-nudge-nudge" to Theo Bendit, respecting his comment on Robert Shore's answer, follows what is basically a variant of the "brute force" method, but one which illustrates the importance of the trace and determinant:

$\Bbb R^2$ is possessed of the standard basis $(1, 0)^T$, $(0, 1)^T$; in this basis we may represent $T$ by a matrix

$T = \begin{bmatrix} t_{11} & t_{12} \\ t_{21} & t_{22} \end{bmatrix}; \tag 1$

then

$T^2 = \begin{bmatrix} t_{11} & t_{12} \\ t_{21} & t_{22} \end{bmatrix}\begin{bmatrix} t_{11} & t_{12} \\ t_{21} & t_{22} \end{bmatrix} = \begin{bmatrix} t_{11}^2 + t_{12}t_{21} & t_{11}t_{12} + t_{12}t_{22} \\ t_{21}t_{11} + t_{22}t_{21} & t_{12}t_{21} + t_{22}^2 \end{bmatrix}; \tag 2$

we "define" the two real numbers

$\det (T) = t_{11} t_{22} - t_{12}t_{21}; \tag 3$

$\text{Tr}(T) = t_{11} + t_{22}; \tag 4$

then

$T^2 + (\det (T)) I = \begin{bmatrix} t_{11}^2 + t_{11}t_{22} & t_{11}t_{12} + t_{12}t_{22} \\ t_{21}t_{11} + t_{22}t_{21} & t_{11}t_{22} + t_{22}^2 \end{bmatrix} = \begin{bmatrix} t_{11}(t_{11} + t_{22}) & t_{12}(t_{11} + t_{22}) \\ t_{21}(t_{11} + t_{22}) & t_{22}(t_{11} + t_{22}) \end{bmatrix} = (\text{Tr}(T)) T, \tag 5$

whence $T^2 - (\text{Tr}(T))T + (\det (T)) I = 0. \tag 6$
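
The algebra in (5)–(6) can be double-checked symbolically; the following SymPy snippet (my own scratch work, not part of the answer) expands $T^2 - (\text{Tr}(T))T + (\det(T))I$ for a generic $2\times 2$ matrix and confirms it is the zero matrix:

```python
import sympy as sp

# Generic 2x2 matrix with symbolic entries.
t11, t12, t21, t22 = sp.symbols('t11 t12 t21 t22')
T = sp.Matrix([[t11, t12],
               [t21, t22]])

# T^2 - Tr(T)*T + det(T)*I should reduce to the zero matrix, as in (5)-(6).
expr = T**2 - T.trace() * T + T.det() * sp.eye(2)
print(sp.simplify(expr))     # Matrix([[0, 0], [0, 0]])
```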