Expected number of runs in a sequence of coin flips

I'd use indicator random variables. For $1\leq j\leq n-1$, let $Z_j$ take the value $1$ if the $j$th and $(j+1)$st coin tosses are different, and the value $0$ otherwise. Then the number of runs is $R=1+\sum_{j=1}^{n-1}Z_j$ and its expected value is $$\mathbb{E}(R)=1+ \sum_{j=1}^{n-1}\, \mathbb{E}(Z_j)=1+2(n-1)p(1-p).$$ I will leave the calculation $\mathbb{E}(Z_j)=2p(1-p)$ as an exercise!
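For readers who want a quick sanity check, here is a small Monte Carlo sketch in Python (the particular values of `n`, `p`, and `num_trials`, and the helper names, are my own illustrative choices, not part of the answer above):

```python
import random

def count_runs(tosses):
    """Number of maximal blocks of identical outcomes in a toss sequence."""
    return 1 + sum(tosses[j] != tosses[j + 1] for j in range(len(tosses) - 1))

def simulated_mean_runs(n, p, num_trials=100_000, seed=0):
    """Average number of runs over many simulated sequences of n biased tosses."""
    rng = random.Random(seed)
    total = 0
    for _ in range(num_trials):
        tosses = [rng.random() < p for _ in range(n)]  # True = heads with probability p
        total += count_runs(tosses)
    return total / num_trials

n, p = 20, 0.3
print("simulated:  ", simulated_mean_runs(n, p))
print("theoretical:", 1 + 2 * (n - 1) * p * (1 - p))
```

For $n=20$ and $p=0.3$ the theoretical value is $1+2\cdot 19\cdot 0.3\cdot 0.7=8.98$, and the simulated average should land close to it.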


Using indicator random variables is a good way to solve this question (see the solution above). What's more, we can also calculate the variance of the number of runs. Since adding the constant $1$ does not change the variance,
$$\operatorname{Var}(R)=\operatorname{Var}\left(\sum_{j=1}^{n-1}Z_j\right)=E\left[\left(\sum_{j=1}^{n-1}Z_j\right)^2\right]-\left[E\left(\sum_{j=1}^{n-1}Z_j\right)\right]^2.$$
Each $Z_j$ is a Bernoulli random variable with success probability $2pq$, where $q=1-p$, and $Z_j$ and $Z_k$ are independent whenever $\left| j-k \right| \ge 2$ (note that $Z_j$ and $Z_{j+1}$ are dependent).

Putting these facts together, we get $\operatorname{Var}\left( R \right)=2pq\bigl( 2n-3-2pq\left( 3n-5 \right) \bigr)$.
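For completeness, one way to reach this value is to expand the variance of the sum into variances and pairwise covariances, noting that only adjacent indicators contribute a covariance term:

$$
\begin{aligned}
\operatorname{Var}(Z_j) &= 2pq(1-2pq),\\
\operatorname{Cov}(Z_j,Z_{j+1}) &= P(Z_j=1,\,Z_{j+1}=1)-(2pq)^2 = pq-4p^2q^2,\\
\operatorname{Var}(R) &= (n-1)\operatorname{Var}(Z_1)+2(n-2)\operatorname{Cov}(Z_1,Z_2)\\
&= 2pq(2n-3)-4p^2q^2(3n-5)\\
&= 2pq\bigl(2n-3-2pq(3n-5)\bigr).
\end{aligned}
$$

(The event $\{Z_j=1,\,Z_{j+1}=1\}$ means three consecutive tosses alternate, which has probability $pqp+qpq=pq$.)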


We already have two good solutions here, but I wanted to add an answer using the conventional recurrence approach.

Let $\mathbb E(k)$ be the expected number of runs in a series of $k$ coin tosses. It is important to remember that the coin tosses are independent, so in going from $k$ to $k+1$ tosses, the number of runs changes based only on the values of tosses $k$ and $k+1$: it increases by $1$ if the $k$th and $(k+1)$st tosses are different, and stays the same otherwise.

Keeping this in mind, we can translate this into an equation by conditioning on the values of tosses $k$ and $k+1$:

$$ \mathbb E(k+1) = p\bigl(p\,\mathbb E(k) + (1-p)(1+ \mathbb E(k))\bigr) + (1-p)\bigl((1-p)\,\mathbb E(k) + p(1+ \mathbb E(k))\bigr) $$

Rearranging this a bit gives us

$$ \mathbb E(k+1) = 2p(1-p) + \mathbb E(k) $$

This recurrence tells us that $\mathbb E(k)$ is of the form $\mathbb E(k) = Ck + D$ for some constants $C$ and $D$. The result follows after finding these constants from the previous equation and the initial condition $\mathbb E(1)=1$ (a single toss always forms exactly one run).
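For concreteness, substituting $\mathbb E(k)=Ck+D$ into the recurrence pins down $C$, and the initial condition then fixes $D$:

$$
\begin{aligned}
C(k+1)+D &= 2p(1-p)+Ck+D &&\Rightarrow\; C=2p(1-p),\\
\mathbb E(1)=C+D &= 1 &&\Rightarrow\; D=1-2p(1-p),
\end{aligned}
$$

so $\mathbb E(k)=1+2(k-1)p(1-p)$, in agreement with the indicator-variable answer above.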