When we raise a matrix whose rows or columns sum to $1$ to the powers $2, 3, \dots, n$, why does the resulting matrix tend to stabilize for large $n$?

This does not always happen.

Consider $A=\begin{pmatrix}0&1\\1&0\end{pmatrix}$. Then $A^{n}=I$ for $n$ even and $A^n=A$ for $n$ odd.

Consider $C=\begin{pmatrix}2&-1\\-1&2\end{pmatrix}$. Then $C^n=\begin{pmatrix}\frac{3^n+1}2&\frac{1-3^n}2\\\frac{1-3^n}2&\frac{3^n+1}2\end{pmatrix}$, i.e., all matrix entries grow to $\pm \infty$.
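If you want to see both behaviors numerically, here is a quick sketch (assuming NumPy is available):

```python
import numpy as np

A = np.array([[0, 1], [1, 0]])
C = np.array([[2, -1], [-1, 2]])

for n in (1, 2, 3, 10):
    print(f"A^{n} =\n{np.linalg.matrix_power(A, n)}")
    print(f"C^{n} =\n{np.linalg.matrix_power(C, n)}")
# A^n flips between A (n odd) and the identity (n even);
# the entries of C^n grow like 3^n.
```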

The difference from your examples is this: while both $A$ and $C$ have an eigenvector of eigenvalue $1$, i.e., there exists $v\ne 0$ with $Av=v$ (resp. $Cv=v$), this eigenvalue is not strictly the largest in absolute value: we find $w\ne 0$ with $Aw=-w$ (so an eigenvector of eigenvalue $-1$, which is just as large as $1$ in absolute value), resp. with $Cw=3w$, where $3$ is much larger than $1$. In fact $v=\begin{pmatrix}1\\1\end{pmatrix}$ and $w=\begin{pmatrix}1\\-1\end{pmatrix}$ happen to have the described properties for both $A$ and $C$.
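A quick numerical check of these claims:

```python
import numpy as np

A = np.array([[0, 1], [1, 0]])
C = np.array([[2, -1], [-1, 2]])
v = np.array([1, 1])    # claimed: eigenvalue 1 for both A and C
w = np.array([1, -1])   # claimed: eigenvalue -1 for A, eigenvalue 3 for C

print(A @ v, A @ w)  # [1 1] [-1  1]  i.e. Av = v and Aw = -w
print(C @ v, C @ w)  # [1 1] [ 3 -3]  i.e. Cv = v and Cw = 3w
```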

While the occurrence of the same $w$ is somewhat arbitrary (or rather, due to my not bothering to make up more complicated examples), the occurrence of the same $v$ in my examples is not just chance: the very fact that the row sums all equal $1$ means precisely that multiplying the matrix with the all-ones vector produces the all-ones vector again. (So $v$ is also an eigenvector of eigenvalue $1$ for your example matrix $B$.) What makes your example different is that the other eigenvalue (yes, there is another) is strictly smaller than $1$ in absolute value: without exhibiting the corresponding eigenvector, I can see immediately that it is $0.45$. (The trick: the two eigenvalues of a $2\times 2$ matrix sum to its trace, the sum of the diagonal entries, and since one of them is $1$, the other must be the trace minus $1$. Be surprised by my magic, or go ahead and find somewhere to learn more about eigenvalues and such. ;) )
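To illustrate (your actual $B$ is not reproduced here, so the matrix below is a made-up stand-in with row sums $1$ and trace $1.45$):

```python
import numpy as np

# Made-up stand-in for the question's matrix B: row sums are 1, trace is 1.45.
B = np.array([[0.70, 0.30],
              [0.25, 0.75]])

print(B @ np.ones(2))        # [1. 1.]  (row sums 1 => all-ones vector is preserved)
print(np.linalg.eigvals(B))  # 1.0 and 0.45 (order may vary): trace 1.45 minus 1
```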

The interesting point is that $v,w$ form a basis of $\Bbb R^2$, that is, any vector $u$ can be written as $u=av+bw$. It follows that $C^nu=C^n(av+bw)=aC^nv+bC^nw=a\cdot 1^nv+b\cdot 3^nw$. Hence unless $b=0$ (i.e., unless $u$ is exactly a multiple of $v$), the summand $b\cdot 3^nw$ will sooner or later dominate and $C^nu$ will start growing $\sim 3^n$. In fact, you may notice that while $C^n$ itself does not stabilize, the matrix $\frac1{3^n}C^n$ does "stabilize" after a few steps.

Similarly, we find $A^nu = av+b(-1)^nw$, which will flip between two values forever.

But for your example matrix $B$ we have (with a different $w$ that I am too lazy to compute) $B^nu=av+b\cdot 0.45^n w$. This time $|0.45|<1$ implies that the $w$-part tends to zero as $n$ grows, i.e., for large $n$ the effect of $B^n$ on any vector is approximately to map it to its $v$-component $av$. As this approximation does not depend on $n$, all $B^n$ (for large enough $n$) are nearly equal.
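Numerically (again with the made-up stand-in $B$ from above, since your actual matrix is not shown here):

```python
import numpy as np

C = np.array([[2, -1], [-1, 2]])
B = np.array([[0.70, 0.30],
              [0.25, 0.75]])   # made-up stand-in for the question's B

for n in (5, 10, 20):
    print(np.linalg.matrix_power(C, n) / 3.0**n)
    # -> approaches [[0.5, -0.5], [-0.5, 0.5]], the projection onto the w-direction
    print(np.linalg.matrix_power(B, n))
    # -> approaches a fixed matrix whose rows are both approx [0.4545, 0.5455]
```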


These matrices are called stochastic matrices. When you learn more about matrices, you will hear about something they have called eigenvalues. A stochastic matrix always has $1$ as an eigenvalue, and all of its eigenvalues have modulus (absolute value) $\leq 1$. This means that when multiplying with the matrix, one particular (column) vector is preserved, and all others either keep their magnitude or shrink, but never grow. For stochastic matrices, the vector which is preserved (has eigenvalue $1$) is called the steady state vector.
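A small numerical sketch of this (with a made-up row-stochastic matrix; NumPy assumed):

```python
import numpy as np

# A made-up row-stochastic matrix: every row sums to 1.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

print(np.abs(np.linalg.eigvals(P)))  # moduli 1.0 and 0.4: one eigenvalue is 1, none exceed 1

# Repeatedly applying P to a probability (row) vector never grows it,
# and it converges to the steady state:
p = np.array([1.0, 0.0])
for _ in range(50):
    p = p @ P
print(p)  # approx [0.8333 0.1667]
```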


There are many interpretations and usages of such matrices; what the sum equaling $1$ can be interpreted as is given in parentheses:

  • Approximating diffusion in the sciences (matter/energy is indestructible, so the total is preserved).
  • Calculating probabilities (the probabilities of all possible events sum to $1$).

Note: the steady state vector is in general not the vector full of ones.

We would need to go into left and right eigenvectors to clear this up, but that is probably overkill here.
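Still, for the curious, here is a minimal sketch of the distinction (using the same kind of made-up row-stochastic matrix as above):

```python
import numpy as np

P = np.array([[0.9, 0.1],
              [0.5, 0.5]])  # made-up row-stochastic matrix (rows sum to 1)

# Right eigenvector for eigenvalue 1: the all-ones vector, because the rows sum to 1.
print(P @ np.ones(2))  # [1. 1.]

# Left eigenvector for eigenvalue 1 (a right eigenvector of P transposed):
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmax(np.isclose(vals, 1.0))])
pi /= pi.sum()         # normalize so the entries sum to 1
print(pi)              # approx [0.8333 0.1667]: the steady state, clearly not all ones
```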