Expected number of dice rolls before rolling "1,2,3,4,5,6"

Solving a set of linear recurrences is indeed a good, elementary way to go, but if you solve the recurrences in the answer by @Canardini - which I did using wolfram alpha - you find that the answer is $E_X = 46656 = 6^6$. This is such a special number that you might wonder if there is a more fundamental explanation, and indeed there is, using more powerful theorems of Markov Chains.

Claim: If the desired string $x$ has the property that two copies of $x$ cannot overlap (which holds for $x = 123456$ in the OP question but does not hold for e.g. $x=111111$ or $x=121212$), then the expected time to first occurrence of $x$ is $6^L$ where $L$ is the length of $x$.

Consider a Markov Chain with $6^6$ states, where each state is a possible sequence in $\{1,2,3,4,5,6\}^6$ and records the last $6$ rolls. Each state can transition to $6$ states (i.e. it has "out-degree" $6$) with equal prob $1/6$. E.g. the state $\color{red}{1}13462$ can transition to $13462\color{blue}{j}$ where $\color{blue}{j}$ can be any of $\{1,2,3,4,5,6\}$. The red $\color{red}{1}$ represents the oldest die-roll result that has "aged out" and the blue $\color{blue}{j}$ represents the newest die-roll result. Note that each state also has "in-degree" $6$, i.e. only $6$ states can transition to it. (Self-loops are possible and count as both in-degree and out-degree.)

It is obvious such a Markov Chain is aperiodic, positive recurrent, irreducible, ergodic, etc., all the good stuff. Further, because every state's in-degree $=$ out-degree $= 6$, the chain's unique stationary distribution $\pi$ (also its limiting distribution) is the $6^6$-long vector whose every entry is $6^{-6}$.
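
For a quick numerical check (a sketch on a scaled-down chain, not part of the argument), one can build the transition matrix for memory length $3$ instead of $6$ and verify that the uniform vector is stationary and that every column sums to $1$:

import itertools
import numpy as np

# Scaled-down version of the chain: alphabet {1,...,6}, memory length 3,
# so there are 6**3 = 216 states instead of 6**6.
SIDES, L = 6, 3
states = list(itertools.product(range(1, SIDES + 1), repeat=L))
index = {s: i for i, s in enumerate(states)}

P = np.zeros((len(states), len(states)))
for s in states:
    for j in range(1, SIDES + 1):
        # the oldest roll "ages out", the new roll j is appended
        P[index[s], index[s[1:] + (j,)]] = 1 / SIDES

pi = np.full(len(states), 1 / len(states))   # candidate uniform distribution
print(np.allclose(pi @ P, pi))               # True: uniform pi is stationary
print(np.allclose(P.sum(axis=0), 1.0))       # True: in-degree 6, each edge with prob 1/6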

A powerful (but somewhat "intuitively obvious?") theorem says that, if $\tau_{xx}$ is the revisit time from state $x$ back to state $x$, then:

Theorem: for a positive recurrent Markov Chain with stationary distribution $\pi$, we have $E[\tau_{xx}] = 1/\pi_x$ for any state $x$.

E.g. see Prop 2.6 of these notes or Theorem 1.19 of these notes or (for a slightly different version) wikipedia

IMHO this theorem is "intuitively obvious" in the following sense: the limiting distribution $\pi$ means in the long run the chain is going to spend $\pi_x$ fraction of the time in state $x$, so it only makes sense that the inter-visit time $\tau_{xx}$ has an expected value of $1/\pi_x$. However, such an "intuitive" argument is not rigorous, and the theorem has a non-trivial proof making use of positive recurrence.
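
As a quick sanity check of the theorem itself, here is a minimal simulation sketch on a small made-up $3$-state chain (nothing to do with dice), comparing $1/\pi_x$ with the simulated mean return time:

import numpy as np

# Arbitrary 3-state chain, just to illustrate E[tau_xx] = 1/pi_x.
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.4, 0.4, 0.2]])

# Stationary distribution: normalized left eigenvector for eigenvalue 1.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.isclose(vals, 1)]).ravel()
pi /= pi.sum()

# Simulate the chain and record the times between consecutive visits to state 0.
rng = np.random.default_rng(0)
state, steps, returns = 0, 0, []
for _ in range(200_000):
    state = rng.choice(3, p=P[state])
    steps += 1
    if state == 0:
        returns.append(steps)
        steps = 0
print(1 / pi[0], np.mean(returns))   # the two values should be close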

Anyway, based on this theorem, and letting $x=123456$ be the state we're interested in, we have $E[\tau_{xx}] = 1/6^{-6} = 6^6$. I.e., if we have just rolled $123456$, then the expected time to roll the next $123456$ is $6^6$. This isn't quite the same as the OP question. However, if we have just rolled $123456$, then none of these old roll results can be part of the next $123456$, and therefore waiting for the next occurrence is equivalent to rolling from the very beginning (when the "history" of rolls is the empty string). This is a direct result of the fact that two copies of $123456$ cannot overlap. So the same expected time $6^6$ also answers the OP question.
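
To see this numerically without solving anything, here is a small Monte Carlo sketch for the scaled-down non-overlapping pattern $123$; by the same argument its expected first-occurrence time should be $6^3 = 216$:

import numpy as np

# Monte Carlo estimate of the time to first see the (non-overlapping) pattern 1,2,3.
rng = np.random.default_rng(1)
pattern = (1, 2, 3)

def time_to_pattern(rng, pattern):
    """Roll a fair die until the last len(pattern) rolls equal the pattern."""
    last, rolls = (), 0
    while last != pattern:
        last = (last + (int(rng.integers(1, 7)),))[-len(pattern):]
        rolls += 1
    return rolls

times = [time_to_pattern(rng, pattern) for _ in range(20_000)]
print(np.mean(times))   # should be close to 6**3 = 216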


Addendum: for some other strings, this theorem also gives a quick way to find the expected time of first occurrence. E.g. consider $y=111111$. The same theorem says that $E[\tau_{yy}] = 6^6$. But it is also obvious that a revisit can either happen right away (if the next roll is $1$) or much later. I.e.:

$$E[\tau_{yy}] = 1 + (\frac16 \times 0 + \frac56 \times E[T_y])$$

where $T_y=$ time to first occurrence of $y$ starting with no useful history (including the case of starting from scratch, i.e. empty history). Solving for $E[T_y]$ we get:

$$E[T_y] = (6^6 - 1) \times \frac65 = 55986$$

which can be easily verified by solving the corresponding set of linear recurrences for the string $y=111111$.
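
For the record, here is a short numpy sketch (labeling the recurrence states myself: state $k$ means "the last $k$ rolls were all $1$s") that solves those recurrences and reproduces $55986$:

import numpy as np

# E_k = expected further rolls when the last k rolls were all 1s (k = 0..5), with E_6 = 0.
# Recurrence: E_k = 1 + (1/6) E_{k+1} + (5/6) E_0.
A = np.zeros((6, 6))
b = np.ones(6)
for k in range(6):
    A[k, k] += 1
    A[k, 0] -= 5 / 6          # any non-1 throws us back to state 0
    if k + 1 < 6:
        A[k, k + 1] -= 1 / 6  # a 1 advances us to state k+1; the E_6 = 0 term simply drops
E = np.linalg.solve(A, b)
print(E[0])   # 55986.0, matching (6**6 - 1) * 6 / 5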


Hint:

Picture it as a Markov chain. You start at the state $X$, aka "I have made no progress yet towards the string $123456$".

If the next roll is a $1$, you move to state $1$; otherwise you go back to state $X$. If you are at state $1$ and the next roll is a $2$, you move to state $2$; otherwise you fail to construct the string. In the latter case, either you rolled a $1$ and do not start from zero (you stay at state $1$), or you rolled a $3,4,5$ or $6$ and fall back to state $X$.

The same logic applies to states $2,3,4,5$.

Let $E_m$ denote the expected number of rolls needed from state $m$ to get the string $123456$.

Trivially, $E_6=0$.

$$E_X=1+\frac{1}{6}E_1+\frac{5}{6}E_X$$ $$E_1=1+\frac{1}{6}E_1+\frac{4}{6}E_X+\frac{1}{6}E_2$$ $$E_2=1+\frac{1}{6}E_1+\frac{4}{6}E_X+\frac{1}{6}E_3$$ $$E_3=1+\frac{1}{6}E_1+\frac{4}{6}E_X+\frac{1}{6}E_4$$ $$E_4=1+\frac{1}{6}E_1+\frac{4}{6}E_X+\frac{1}{6}E_5$$ $$E_5=1+\frac{1}{6}E_1+\frac{4}{6}E_X+\frac{1}{6}E_6$$

You solve that system of equations, and your answer is $E_X$.
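
If you would rather not solve it by hand, here is a minimal numpy sketch (ordering the unknowns as $E_X, E_1, \dots, E_5$ and substituting $E_6 = 0$):

import numpy as np

# Unknowns ordered as (E_X, E_1, E_2, E_3, E_4, E_5); E_6 = 0 is substituted.
A = np.zeros((6, 6))
b = np.ones(6)

# E_X = 1 + (1/6) E_1 + (5/6) E_X
A[0, 0] = 1 - 5 / 6
A[0, 1] = -1 / 6

# E_m = 1 + (1/6) E_1 + (4/6) E_X + (1/6) E_{m+1}   for m = 1,...,5
for m in range(1, 6):
    A[m, m] += 1
    A[m, 1] -= 1 / 6
    A[m, 0] -= 4 / 6
    if m + 1 < 6:
        A[m, m + 1] -= 1 / 6   # for m = 5 the term is (1/6) E_6 = 0

E = np.linalg.solve(A, b)
print(E[0])   # 46656.0 = 6**6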


Usually we model the situation with a Markov chain with the states as in the following picture:

       1/6      1/6       1/6        1/6         1/6          1/6
  (*) -->-- *1 -->-- *12 -->-- *123 -->-- *1234 -->-- *12345 -->-- [*123456]
Initial                                                              Final

   0         1         2          3           4            5              6

and there are also arrows going backwards with corresponding probabilities to be extracted from the following Markov matrix of the process: $$ A= \begin{bmatrix} 5/6 & 1/6 \\ 4/6 & 1/6 & 1/6 \\ 4/6 & 1/6 & & 1/6 \\ 4/6 & 1/6 & & & 1/6 \\ 4/6 & 1/6 & & & & 1/6 \\ 4/6 & 1/6 & & & & & 1/6 \\ & & & & & & 1 \\ \end{bmatrix} \ . $$ (The state $6$ was made absorbing; this does not matter for us.)

Above, $*$ is a placeholder for "any word (string, including the empty one) not ending in $1$". We also use the labels $0,1,2,3,4,5,6$ for a simpler notation. Write $s_k$ for the expected number of steps to go from state $k=*12\dots k$ (with $0=*$) to the final state $6=*123456$. Of course, $s_6=0$. We have the obvious Markovian system of equations: $$ \left\{ \begin{aligned} s_0 \color{red}-1 &= \frac 56s_0+\frac 16s_1\\ s_1 \color{red}-1 &= \frac 46s_0+\frac 16s_1+\frac 16s_2\\ s_2 \color{red}-1 &= \frac 46s_0+\frac 16s_1\qquad +\frac 16s_3\\ s_3 \color{red}-1 &= \frac 46s_0+\frac 16s_1\qquad\qquad+\frac 16s_4\\ s_4 \color{red}-1 &= \frac 46s_0+\frac 16s_1\qquad\qquad\qquad+\frac 16s_5\\ s_5 \color{red}-1 &= \frac 46s_0+\frac 16s_1\qquad\qquad\qquad\qquad+\frac 16s_6\\ s_6 &= 0 \end{aligned} \right. $$


Later edit: Corrected and completed answer. (After the holidays, now we have the usual general relativity theories governing time and space.)

The first equation corresponds to the following thoughts. Suppose we are in the state $0=*$. On average we need $s_0>0$ more steps till we reach the final state $6=*123456$. So let us make one (imaginary) step. We land

  • with probability $5/6$ again in $0$, from where we further need in mean $s_0$ steps, and
  • with probability $1/6$ in $1$, from where we further need in mean $s_1$ steps.

So after the imaginary step we need in mean $\frac 56s_0+\frac 16s_1$ steps. This corresponds to $s_0\color{red}-1$. The other equations have similar markovian motivations.

The solution of the system is $$ \begin{aligned} s_0 &= 6^6 = 46656\ ,\\ s_1 &= 6^6 - 6^1= 46650\ ,\\ s_2 &= 6^6 - 6^2= 46620\ ,\\ s_3 &= 6^6 - 6^3= 46440\ ,\\ s_4 &= 6^6 - 6^4= 45360\ ,\\ s_5 &= 6^6 - 6^5= 38880\ ,\\ s_6 &= 6^6 - 6^6= 0\ . \end{aligned} $$ So we need in mean $6^6$ steps from the initial state till the final state. As a by-product of the computation we also obtain the information that, on average, $6^6-6^k$ steps are needed if we start from the state $k=*12\dots k$, till reaching the final state $6=*123456$.
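
As a numerical cross-check (just a sketch, not needed for the argument), the same values come out of the standard absorbing-chain formula $s = (I - Q)^{-1}\mathbf{1}$, where $Q$ is the $6\times 6$ block of $A$ over the transient states $0,\dots,5$:

import numpy as np

# Transient part Q of the matrix A above (states 0..5; the absorbing state 6 is dropped).
Q = np.array([
    [5/6, 1/6, 0,   0,   0,   0  ],
    [4/6, 1/6, 1/6, 0,   0,   0  ],
    [4/6, 1/6, 0,   1/6, 0,   0  ],
    [4/6, 1/6, 0,   0,   1/6, 0  ],
    [4/6, 1/6, 0,   0,   0,   1/6],
    [4/6, 1/6, 0,   0,   0,   0  ],
])
# Expected number of steps to absorption from each transient state.
s = np.linalg.solve(np.eye(6) - Q, np.ones(6))
print(s)   # [46656. 46650. 46620. 46440. 45360. 38880.]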


(Please ignore the following if annoying.)

Here is a slow simulation using python / numpy / sage:

import numpy as np

# roll a fair die 6**9 times
d = np.random.randint(1, 7, 6**9)
# stack the 6 shifted copies: column k holds rolls k, k+1, ..., k+5
e = np.stack([d[0:-5], d[1:-4], d[2:-3], d[3:-2], d[4:-1], d[5:]])

# count how often each length-6 pattern occurs as a column
patterns, count = np.unique(e, axis=1, return_counts=True)
# index of the pattern (1,2,3,4,5,6) among the lexicographically sorted patterns
# (this assumes every one of the 6**6 patterns occurs at least once, which is virtually certain here)
N = 6**4 + 2*6**3 + 3*6**2 + 4*6 + 5
patterns[:, N]
count[N]

Results this time:

array([1, 2, 3, 4, 5, 6])
212

So in a long string of length $6^9$ the pattern array([1, 2, 3, 4, 5, 6]) occurs some $212$ times, which is close to $6^3 = 216$; so we expect a mean near $6^6 = 6^9/6^3$.