Finding the rank of the matrix directly from eigenvalues

The rank of a matrix is defined as the dimension of its column space. Let $r$ denote the rank of a matrix $A$ acting on a finite-dimensional vector space $V$.

The rank theorem (sometimes called the rank-nullity theorem) relates the rank of a matrix to the dimension of its null space (sometimes called the kernel) via the relation $\dim V = r + \dim \mathrm{Null}\, A$.
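For instance, here is a quick sanity check of the rank theorem using sympy (the library and the example matrix are my own illustrative choices, not anything from the question):

```python
from sympy import Matrix

# An arbitrary 3x3 example; row 2 is twice row 1, so the rank drops to 2.
A = Matrix([[1, 2, 3],
            [2, 4, 6],
            [0, 1, 1]])

dim_V = A.cols                  # A acts on a 3-dimensional space
r = A.rank()                    # dimension of the column space
nullity = len(A.nullspace())    # dimension of the null space (kernel)

assert dim_V == r + nullity     # dim V = r + dim Null A
print(r, nullity)               # 2 1
```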

But how does this relate to eigenvalues and eigenvectors?

Recall the definition of an eigenvalue: $\lambda$ is an eigenvalue of a linear transformation $T$ if and only if there exists a nonzero $v \in V$ such that $Tv = \lambda v \iff (T- \lambda I)v = 0$.

Recall the definition of an eigenvector: a nonzero $v$ is an eigenvector of the linear transformation $T$ corresponding to the eigenvalue $\lambda$ if and only if $Tv = \lambda v \iff (T- \lambda I)v = 0$.

Thus it makes sense to define the eigenspace as the set of all eigenvectors corresponding to $\lambda$, together with the zero vector (which happens to be a subspace), i.e., the set of all vectors $v$ such that $(T- \lambda I)v = 0$. This is just the null space of $T- \lambda I$. So: $\mathrm{E}(\lambda, T) = \mathrm{Null}(T- \lambda I)$.

This shows the relationship between eigenspaces and null spaces. How do we relate this back to the null space of $T$ itself? Take $\lambda = 0$. Then:

$$E(0, T) = \mathrm{Null} ~ T$$

So the null space of $T$ is exactly the eigenspace corresponding to the eigenvalue $0$; in particular, they have the same dimension. By the rank theorem, we relate this back to the rank of the matrix:

$$\mathrm{rank}(T) = \dim V - \dim \mathrm{E}(0, T)$$
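To see this formula in action, here is a small sympy sketch (the matrix is again just an illustrative choice of mine):

```python
from sympy import Matrix

# Upper triangular, so the eigenvalues 1, 0, 2 sit on the diagonal.
T = Matrix([[1, 1, 0],
            [0, 0, 0],
            [0, 0, 2]])

# E(0, T) = Null(T - 0*I) = Null(T)
dim_E0 = len(T.nullspace())        # geometric multiplicity of eigenvalue 0
rank_from_eigenspace = T.cols - dim_E0

assert rank_from_eigenspace == T.rank()
print(dim_E0, rank_from_eigenspace)   # 1 2
```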

Now, a common misconception appears here: many people initially think that the dimension of the eigenspace is equal to the (algebraic) multiplicity of the eigenvalue, but this is not true. Consider:

$$B = \begin{bmatrix} 0 & 1&0 \\0&0&1\\0&0&0\end{bmatrix}$$

For this matrix, the eigenvalue $0$ has an algebraic multiplicity of $3$, but the dimension of the eigenspace corresponding to $0$ (and thus the null space of this matrix) is only $1$:

$\mathrm{Null}B = \mathrm{span}\left(\begin{bmatrix} 1\\0\\0 \end{bmatrix}\right)$.
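You can confirm both claims about $B$ computationally; here is a quick sympy check (my own illustration):

```python
from sympy import Matrix

B = Matrix([[0, 1, 0],
            [0, 0, 1],
            [0, 0, 0]])

print(B.eigenvals())        # {0: 3} -> algebraic multiplicity of 0 is 3
print(len(B.nullspace()))   # 1     -> but dim E(0, B) is only 1
print(B.rank())             # 2     -> consistent with 3 - 1 = 2
```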

So to calculate the dimension of the eigenspace corresponding to the eigenvalue $0$, you cannot just count the number of times $0$ appears as an eigenvalue; you must find a basis for $\mathrm{Null}(A)$ and count its length, which gives the dimension of the null space. From there, you can get the rank from the rank theorem.

But we do have an upper bound: in general, the dimension of each eigenspace (the geometric multiplicity) cannot be larger than the (algebraic) multiplicity of the eigenvalue.

In your example, $0$ is an eigenvalue, which means there exists a nonzero eigenvector $v$ such that $Tv=0$, so the dimension of the null space is at least $1$. The dimension of the null space also cannot be larger than $1$: the geometric multiplicity is at most the algebraic multiplicity, and you were given that the algebraic multiplicity of $0$ is $1$.

This means the dimension of the eigenspace corresponding to the eigenvalue $0$ is at least $1$ and at most $1$, so it is exactly $1$. Thus the dimension of the null space is $1$, and by the rank theorem the rank is $2$. Note that the rank cannot be less than $2$ either, precisely because the dimension of the eigenspace corresponding to $0$ cannot exceed the algebraic multiplicity.

Moving on: the way you seem to be thinking about this problem is not ideal, but I will address the specific questions you asked in order to answer fully.

First, on linearly independent eigenvectors:

Eigenvectors with distinct eigenvalues are always linearly independent. I'll prove the following theorem (I follow Axler's Linear Algebra Done Right for much of this proof):

Let $\lambda_1, ... , \lambda_m$ be distinct eigenvalues of $T$, and let $v_1, ... , v_m$ be corresponding nonzero eigenvectors. Then $v_1, ... , v_m$ is linearly independent.

We will proceed by contradiction. Assume the list is linearly dependent.

Let $k$ be the smallest index such that $v_k$ lies in the span of $v_1, ... , v_{k-1}$; such a $k$ exists because the list is linearly dependent, and by the minimality of $k$, the list $v_1, ... , v_{k-1}$ is linearly independent. So we can write:

$v_k = a_1v_1 + ... + a_{k-1}v_{k-1}$ for some $a_1, ... , a_{k-1}$, or

$$\lambda_kv_k = \lambda_k(a_1v_1 + ... + a_{k-1}v_{k-1}) \tag{*}$$

(multiplying both sides by $\lambda_k$)

Apply $T$ to both sides of the first equation (recall each $v_i$ is an eigenvector with eigenvalue $\lambda_i$):

$$\lambda_k v_k = \lambda_1a_1v_1 + ... + \lambda_{k-1}a_{k-1}v_{k-1}$$

Subtracting this equation from (*) produces:

$$0 = a_1(\lambda_k-\lambda_1)v_1 + ... + a_{k-1}(\lambda_k-\lambda_{k-1})v_{k-1}$$

Since $v_1, ... , v_{k-1}$ is linearly independent by the minimality of $k$, every coefficient $a_i(\lambda_k-\lambda_i)$ must be zero; and since the eigenvalues are distinct, no $\lambda_k-\lambda_i$ is zero, so every $a_i$ is zero. But then the first equation gives $v_k = 0$, and we assumed $v_k$ was nonzero. This is the contradiction we need, so $v_1, ... , v_m$ must be linearly independent.

So, eigenvectors with distinct eigenvalues are linearly independent. Eigenvectors with the same eigenvalue may be linearly dependent: if $v$ is an eigenvector, then $av$ is also an eigenvector for any nonzero $a \in \mathbb{F}$, because $T(av) = aT(v) = a\lambda v = \lambda (av)$. But as I said above, by calculating the dimension of the null space, you can determine the dimension of the eigenspace, and thus the maximum number of linearly independent eigenvectors corresponding to the same eigenvalue.
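If you want a concrete sanity check of the theorem, here is a small sympy experiment (the matrix is an arbitrary choice of mine with three distinct eigenvalues):

```python
from sympy import Matrix

# Upper triangular, so the eigenvalues 2, 3, 5 are the diagonal entries.
A = Matrix([[2, 1, 0],
            [0, 3, 1],
            [0, 0, 5]])

# Take one eigenvector per eigenvalue and stack them as columns.
vecs = [basis[0] for (lam, mult, basis) in A.eigenvects()]
V = Matrix.hstack(*vecs)

# Three linearly independent columns <=> the stacked matrix has rank 3.
assert V.rank() == 3
```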

I'm not quite sure what you mean by "Does [sic] distinct eigenvalues always form the same eigenvector?", but I think this proof of the linear independence of eigenvectors with distinct eigenvalues will answer whatever concern you had.

As for why your approach didn't work: you cannot argue that there are two linearly independent eigenvectors corresponding to the eigenvalue $2$, and hence that the dimension of the column space is $2$, because you don't know that the dimension of that eigenspace is $2$. You would need to calculate $\mathrm{Null}(A-2I)$. We can have a transformation where the eigenvalue $2$ has algebraic multiplicity $2$ but the dimension of the eigenspace is not $2$. Namely:

$$\begin{bmatrix}0&0&0\\0&2&1\\0&0&2\end{bmatrix}$$
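Here is a quick sympy verification of this example (and of the earlier rank deduction, since this matrix has eigenvalues $0$, $2$, $2$):

```python
from sympy import Matrix, eye

A = Matrix([[0, 0, 0],
            [0, 2, 1],
            [0, 0, 2]])

print(A.eigenvals())                    # {0: 1, 2: 2} -> algebraic multiplicities
print(len((A - 2*eye(3)).nullspace()))  # 1 -> dim E(2, A) is 1, not 2
print(A.rank())                         # 2 -> dim V - dim E(0, A) = 3 - 1
```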


In summary: the rank is equal to the dimension of the space minus the dimension of the kernel, and the dimension of the kernel is equal to the dimension of the eigenspace for the eigenvalue $0$.