Throwing a fair die until most recent roll is smaller than previous one

Let $E_n$ denote the expected number of rolls of a fair $n$-sided die, and let $p_k^{(n)}$ be the probability that the process stops after exactly $k$ rolls, so that $$E_n = \sum_{k=2}^\infty k\, p_k^{(n)}.$$ Doing casework on the second-to-last roll $m$: the first $k-1$ rolls must be non-decreasing and end at $m$ (there are ${m+k-3 \choose k-2}$ such sequences, each occurring with probability $n^{-(k-1)}$), and the final roll must be smaller than $m$ (probability $\frac{m-1}{n}$). Hence $$p_k^{(n)} = \sum_{m=1}^n n^{-(k-1)}{m+k-3 \choose k-2}\frac{m-1}{n}.$$ Note that, for any fixed $k$, $$\lim_{n \to \infty} p_k^{(n)} = \frac{k-1}{k!}.$$ Therefore, by an easy dominated convergence argument, $$\lim_{n \to \infty} E_n = \sum_{k=2}^\infty k\,\frac{k-1}{k!} = e.$$
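
For a quick numerical sanity check (not part of the argument above), here is a small Python sketch that truncates the series; the helper name `E_truncated` and the truncation point `K` are just illustrative choices. The printed values decrease toward $e$ as $n$ grows.

```python
from math import comb, e

def E_truncated(n, K=60):
    """Approximate E_n by truncating the series sum_{k>=2} k * p_k^(n) at k = K."""
    total = 0.0
    for k in range(2, K + 1):
        # p_k^(n) = sum_m binom(m+k-3, k-2) * (m-1) / n^k
        p_k = sum(comb(m + k - 3, k - 2) * (m - 1) / n**k for m in range(1, n + 1))
        total += k * p_k
    return total

for n in (2, 6, 50, 500):
    print(n, E_truncated(n))   # values decrease toward e as n grows
print("e =", e)
```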


The answer may be expressed more simply: in fact, $E_n = \left( \frac{n}{n-1}\right)^n$.
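
This closed form is easy to check by simulation. The sketch below (function and parameter names are my own, purely for illustration) estimates $E_n$ by Monte Carlo and compares it with $\left(\frac{n}{n-1}\right)^n$.

```python
import random

def simulate(n, trials=200_000):
    """Monte Carlo estimate of the expected number of rolls of a fair n-sided die,
    stopping as soon as a roll is strictly smaller than the one before it."""
    total = 0
    for _ in range(trials):
        prev = random.randint(1, n)
        rolls = 1
        while True:
            cur = random.randint(1, n)
            rolls += 1
            if cur < prev:
                break
            prev = cur
        total += rolls
    return total / trials

for n in (2, 6, 20):
    print(n, simulate(n), (n / (n - 1)) ** n)   # the two columns should agree to ~2 decimals
```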

Update 1: (The following was independently obtained by Pierre PC; I only found out after I finished typing.) Here is a simple way to see this. Let $E_{n,\ell}$ denote the expected number of further rolls, given that the most recent roll is $\ell$ and the process has not yet stopped. We then have the formula \begin{equation*} E_{n,\ell} = \underbrace{1\cdot \frac{1}{n} + \dotsb + 1\cdot \frac{1}{n}}_{\ell-1} + \sum_{k=\ell}^n \left(1+E_{n,k}\right)\cdot \frac{1}{n}. \end{equation*} Solving these backwards (i.e., solving for $E_{n,n}, E_{n,n-1}, \dotsc, E_{n,1}$ in turn) gives $E_{n,\ell}=\left( \frac{n}{n-1}\right)^{n-\ell+1}$. It then follows that \begin{equation*} E_n = 1 + \sum_{\ell=1}^n E_{n,\ell}\cdot \frac{1}{n} = \left( \frac{n}{n-1}\right)^n. \end{equation*}
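
To see the backward solve concretely, here is a short Python sketch (my own illustration, with a hypothetical helper `expected_rolls`) that computes $E_{n,n}, \dotsc, E_{n,1}$ from the recursion and checks each value against the closed form.

```python
def expected_rolls(n):
    """Solve the recursion for E_{n,l} backwards (l = n, n-1, ..., 1) and
    return E_n = 1 + (1/n) * (E_{n,1} + ... + E_{n,n})."""
    suffix = 0.0                                # E_{n,l+1} + ... + E_{n,n}
    for l in range(n, 0, -1):
        # the recursion simplifies to E_{n,l} = 1 + (E_{n,l} + suffix)/n,
        # hence E_{n,l} = (n + suffix)/(n - 1)
        E_l = (n + suffix) / (n - 1)
        assert abs(E_l - (n / (n - 1)) ** (n - l + 1)) < 1e-9   # matches the closed form
        suffix += E_l
    return 1 + suffix / n

for n in (2, 6, 100):
    print(n, expected_rolls(n), (n / (n - 1)) ** n)
```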

Update 2: Here is a derivation starting from mathworker21's formula. We have \begin{align*} E_n & {}= \sum_{k=2}^{\infty} k \sum_{m=1}^n n^{-(k-1)} \binom{m+k-3}{k-2} \frac{m-1}{n}\\ & {}= \sum_{m=1}^n (m-1) \sum_{k=2}^{\infty} k \binom{m+k-3}{k-2} n^{-k}\\ & {}= \sum_{m=1}^n \frac{m-1}{n^2} \sum_{k=0}^{\infty} (k+2) \binom{m+k-1}{k} n^{-k}\\ & {}= \sum_{m=1}^n \frac{m-1}{n^2} \sum_{k=0}^{\infty} (k+2) \binom{-m}{k} \left( -\frac{1}{n} \right)^k.\\ \end{align*} Now massaging the sum and using the binomial series twice gives \begin{align*} \sum_{k=0}^{\infty} (k+2) \binom{-m}{k} x^k & {}= 2 \sum_{k=0}^{\infty} \binom{-m}{k} x^k + \sum_{k=1}^{\infty} k \binom{-m}{k} x^k\\ & {}= 2 (1+x)^{-m} + \sum_{k=1}^{\infty} (-m) \binom{-(m+1)}{k-1} x^k\\ & {}= 2 (1+x)^{-m} - m x (1+x)^{-(m+1)}\\ & {}= \frac{2-(m-2)x}{(1+x)^{m+1}}. \end{align*} Inserting $x=-\frac{1}{n}$ then gives \begin{align*} E_n & {}= \sum_{m=1}^n \frac{m-1}{n^2} \left( \frac{2+\frac{m-2}{n}}{(1-\frac{1}{n})^{m+1}}\right)\\ & {}= \frac{1}{n^2 (n-1)} \sum_{m=1}^n (m-1)(2 n+m-2) \left( \frac{n}{n-1} \right)^m\\ & {}= -\frac{2}{n^2} \left(\sum_{m=1}^n \left( \frac{n}{n-1} \right)^m\right) + \frac{2 n-3}{n^2 (n-1)} \left(\sum_{m=1}^n m \left( \frac{n}{n-1} \right)^m\right) + \frac{1}{n^2 (n-1)} \left(\sum_{m=1}^n m^2 \left( \frac{n}{n-1} \right)^m\right).\\ \end{align*} Finally using \begin{equation*} \sum_{m=1}^n x^m = \frac{x}{1-x} \left( 1-x^n \right), \qquad \sum_{m=1}^n m x^m = \frac{x}{(1-x)^2} \left( n x^{n+1} - (n+1) x^n + 1\right) \qquad\text{ and }\qquad \sum_{m=1}^n m^2 x^m = \frac{x}{(1-x)^3} \left( (1+x) - x^n \left( n^2 (1-x)^2 +2 n (1-x) + x + 1 \right) \right) \end{equation*} one then obtains $E_n=\left(\frac{n}{n-1}\right)^n$ after heavy simplification.
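
The "heavy simplification" can also be delegated to exact rational arithmetic. The following sketch (my addition, using Python's `fractions.Fraction`) evaluates the last displayed expression for $E_n$ directly and confirms it equals $\left(\frac{n}{n-1}\right)^n$ for a few values of $n$.

```python
from fractions import Fraction

def E_via_sums(n):
    """Evaluate the final expression for E_n (three geometric-type sums)
    in exact rational arithmetic."""
    r = Fraction(n, n - 1)
    S0 = sum(r**m for m in range(1, n + 1))            # sum of r^m
    S1 = sum(m * r**m for m in range(1, n + 1))        # sum of m r^m
    S2 = sum(m * m * r**m for m in range(1, n + 1))    # sum of m^2 r^m
    return (-Fraction(2, n**2) * S0
            + Fraction(2 * n - 3, n**2 * (n - 1)) * S1
            + Fraction(1, n**2 * (n - 1)) * S2)

for n in (2, 3, 6, 10):
    assert E_via_sums(n) == Fraction(n, n - 1) ** n
print("E_n equals (n/(n-1))^n exactly for the tested n")
```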


Write $\alpha_i$ for the expected number of further rolls starting from “something then $i$”. Then $$ \alpha_i = \frac1n(\alpha_i+1) + \dotsb + \frac1n(\alpha_n+1) + \frac{i-1}n \cdot 1 = 1 + \frac{\alpha_i + \cdots + \alpha_n}n. $$ From this one can deduce $\alpha_i = \left(\frac{n}{n-1}\right)^{n-i+1}$, and, counting the initial roll as well, $$ E_n = 1 + \frac{\alpha_1}n+\cdots+\frac{\alpha_n}n = \left(\frac n{n-1}\right)^n, $$ as explained by Kasper Andersen. Of course the limit is $e$.

The way I got the expression for $\alpha_i$ was to set $\beta_i = \alpha_i/n$, so that the recursion reads $(n-1)\beta_i = \beta_{i+1}+\dotsb+\beta_n+1$, and to realise that $$ (n-1)(\beta_{i-1} - \beta_i) = \left(\beta_i+\dotsb+\beta_n+1\right) - \left(\beta_{i+1}+\cdots+\beta_n+1\right) = \beta_i, $$ from which we deduce $\beta_{i-1} = \frac n{n-1}\beta_i$; then from $\alpha_n = 1+\alpha_n/n$ we get $\beta_n = 1/(n-1)$ and hence $\beta_i=\frac1n\left(\frac n{n-1}\right)^{n-i+1}$.
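
As a final cross-check (mine, not part of the answers above), this sketch follows exactly that $\beta$ recursion numerically, verifies the claimed closed form for each $\beta_i$, and recovers $E_n = \left(\frac{n}{n-1}\right)^n$.

```python
def check_beta(n, tol=1e-12):
    """Follow the beta recursion: beta_n = 1/(n-1) and beta_{i-1} = n/(n-1) * beta_i,
    then recover alpha_i = n * beta_i and E_n = 1 + (alpha_1 + ... + alpha_n)/n."""
    beta = [0.0] * (n + 1)           # beta[i] holds beta_i; index 0 is unused
    beta[n] = 1 / (n - 1)
    for i in range(n, 1, -1):
        beta[i - 1] = n / (n - 1) * beta[i]
    # each beta_i should equal (1/n) * (n/(n-1))^(n-i+1)
    for i in range(1, n + 1):
        assert abs(beta[i] - (n / (n - 1)) ** (n - i + 1) / n) < tol
    alpha = [n * b for b in beta[1:]]
    return 1 + sum(alpha) / n

for n in (2, 6, 50):
    print(n, check_beta(n), (n / (n - 1)) ** n)   # the two columns agree
```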