Limit of powers of $3\times3$ matrix

If you are in state $1$, you have the same probability of staying there as of passing to state $2$, but there is no way to get back. Thus you eventually drift out of $1$ and into $2$.

States $2$ and $3$ are symmetric: in the long run they will tend to be equally populated, independently of the starting conditions.

Therefore, even starting from $1$, you will in the long run be split equally between $2$ and $3$.

Thus the answer is D).
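This qualitative argument is easy to check numerically. Below is a minimal sketch, assuming the transition matrix implied by the question (recovered here from the closed form for $A^n$ at $n=1$; rows are states $1$, $2$, $3$):

```python
import numpy as np

# Transition matrix assumed from the question: from state 1 you stay or
# move to state 2 with equal probability; states 2 and 3 are symmetric.
A = np.array([[0.5, 0.5, 0.0],
              [0.0, 0.75, 0.25],
              [0.0, 0.25, 0.75]])

# Start surely in state 1 and evolve the row distribution p -> p A.
p = np.array([1.0, 0.0, 0.0])
for _ in range(100):
    p = p @ A

print(p)  # ≈ [0, 0.5, 0.5]: mass drains out of state 1 and splits evenly
```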


For the matrix $A$ given in the question, one can show by induction on $n$ that

\begin{equation} A^n= \begin{pmatrix} 2^{-n} & n\cdot 2^{-n-1} - 2^{-n-1} + \frac12 & \frac12\left(1-\frac{n+1}{2^n}\right)\\ 0 & \frac{2^{-n}+1}{2} & \frac{1-2^{-n}}{2} \\ 0 & \frac{1-2^{-n}}{2} & \frac{2^{-n}+1}{2} \end{pmatrix}. \end{equation}

It is thus clear that $\lim_{n\to\infty} A^n = \begin{pmatrix} 0 &\frac{1}{2} & \frac{1}{2}\\ 0 & \frac{1}{2} & \frac{1}{2}\\ 0 & \frac{1}{2} & \frac{1}{2}\end{pmatrix}$.
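The closed form and its limit can be verified numerically; here is a sketch, again assuming the matrix $A$ recovered from the closed form at $n=1$:

```python
import numpy as np

A = np.array([[0.5, 0.5, 0.0],
              [0.0, 0.75, 0.25],
              [0.0, 0.25, 0.75]])

def closed_form(n):
    # Entry-by-entry transcription of the displayed formula for A^n.
    return np.array([
        [2.0**-n, n * 2.0**-(n + 1) - 2.0**-(n + 1) + 0.5, (1 - (n + 1) / 2.0**n) / 2],
        [0.0, (2.0**-n + 1) / 2, (1 - 2.0**-n) / 2],
        [0.0, (1 - 2.0**-n) / 2, (2.0**-n + 1) / 2],
    ])

# The formula agrees with explicit matrix powers...
for n in range(1, 10):
    assert np.allclose(np.linalg.matrix_power(A, n), closed_form(n))

# ...and converges to the claimed limit.
limit = np.array([[0.0, 0.5, 0.5]] * 3)
print(np.allclose(closed_form(50), limit, atol=1e-9))  # True
```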


I’m lazy and prefer not to do tedious matrix inversions and multiplications if I can avoid it. Other answers have explained how to quickly eliminate the given possible solutions based on properties of Markov chains and their associated transition matrices, but one can also reason directly from the eigenvalues of the matrix.

It’s often worth examining a matrix for obvious eigenvectors and eigenvalues, especially in artificial exercises, before plunging into computing and solving the characteristic equation. From the first column of $A$, we see that $(1,0,0)^T$ is an eigenvector with eigenvalue $\frac12$. The rows of $A$ all sum to $1$, so $(1,1,1)^T$ is an eigenvector with eigenvalue $1$. The remaining eigenvalue can be read off from the trace: the eigenvalues sum to $\operatorname{tr}A=2$, so the third eigenvalue is $2-1-\frac12=\frac12$.
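These eigenvalue observations are easy to confirm; the sketch below assumes the matrix $A$ recovered from the closed form for $A^n$ at $n=1$:

```python
import numpy as np

A = np.array([[0.5, 0.5, 0.0],
              [0.0, 0.75, 0.25],
              [0.0, 0.25, 0.75]])

# Spectrum is {1, 1/2, 1/2}, as read off by inspection and the trace.
vals = np.sort(np.linalg.eigvals(A).real)
print(vals)  # eigenvalues 0.5, 0.5, 1

# (1,0,0)^T is an eigenvector with eigenvalue 1/2 (first column of A)...
assert np.allclose(A @ np.array([1.0, 0.0, 0.0]), 0.5 * np.array([1.0, 0.0, 0.0]))
# ...and (1,1,1)^T has eigenvalue 1, since every row of A sums to 1.
assert np.allclose(A @ np.ones(3), np.ones(3))
```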

$A$ is therefore similar to a matrix of the form $J=D+N$, where $D=\operatorname{diag}\left(1,\frac12,\frac12\right)$ and $N$ is nilpotent of order no greater than 2. (If $A$ is diagonalizable, then $N=0$.) $D$ and $N$ commute, so expanding via the Binomial Theorem, $(D+N)^n=D^n+nND^{n-1}$. As $n\to\infty$, $D^n\to\operatorname{diag}(1,0,0)$; in the second term, the first column of $N$ is zero (the eigenvalue $1$ is simple), and the remaining columns of $ND^{n-1}$ carry a factor $n\,2^{1-n}\to0$, so the second term vanishes. Thus, if $A=PJP^{-1}$, then $\lim_{n\to\infty}A^n=P\operatorname{diag}(1,0,0)P^{-1}$, but the right-hand side is just the projector onto the eigenspace of $1$. Informally, repeatedly multiplying a vector by $A$ leaves that vector’s component in the direction of $(1,1,1)^T$ fixed, while the remainder of the vector eventually dwindles away to nothing.

Since $1$ is a simple eigenvalue, there’s a shortcut for computing this projector that doesn’t require computing the change-of-basis matrix $P$: if $\mathbf u^T$ is a left eigenvector of $1$ and $\mathbf v$ a right eigenvector, then the projector onto the right eigenspace of $1$ is $${\mathbf v\mathbf u^T\over\mathbf u^T\mathbf v}.$$ (This formula is related to the fact that left and right eigenvectors with different eigenvalues are orthogonal.) We already have a right eigenvector, and a left eigenvector is easily found by inspection: the last two rows of $A$ sum to $(0,1,1)$, so $(0,1,1)$ is a left eigenvector of $1$. This gives us $$\lim_{n\to\infty}A^n = \frac12\begin{bmatrix}1\\1\\1\end{bmatrix}\begin{bmatrix}0&1&1\end{bmatrix} = \begin{bmatrix}0&\frac12&\frac12\\0&\frac12&\frac12\\0&\frac12&\frac12\end{bmatrix}.$$
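The projector shortcut can be checked in a few lines; as before, this sketch assumes the matrix $A$ recovered from the closed form for $A^n$ at $n=1$:

```python
import numpy as np

A = np.array([[0.5, 0.5, 0.0],
              [0.0, 0.75, 0.25],
              [0.0, 0.25, 0.75]])

v = np.ones((3, 1))              # right eigenvector for eigenvalue 1
u = np.array([[0.0, 1.0, 1.0]])  # left eigenvector for eigenvalue 1

# Projector onto the right eigenspace of 1: v u^T / (u^T v).
P = (v @ u) / (u @ v)
print(P)

# It agrees with a high matrix power of A, i.e. with lim A^n.
assert np.allclose(P, np.linalg.matrix_power(A, 60))
```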
