Expected number of coin tosses before $k$ heads

Let $X_1$ be the number of coin tosses until the first head, $X_2$ the number of tosses after that until the second head, and so on, where each toss lands heads independently with probability $p$. We want $E[X]$ where $X = X_1 + X_2 + \dots + X_k$. By linearity of expectation, $E[X] = E[X_1] + E[X_2] + \dots + E[X_k]$.

For each $i$, the quantity $E[X_i]$, the expected number of tosses until a head appears, is $\dfrac{1}{p}$. You can see this either by computing the mean of the geometric distribution, or by noting that $E[X_i] = 1 + (1-p)E[X_i]$ (one toss is always spent, and with probability $1-p$ we start over), which rearranges to $E[X_i] = \dfrac1p$.
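As a quick sanity check of that fixed-point argument, here is a small sketch (assuming SymPy is available; the symbol names are my own) that solves $E = 1 + (1-p)E$ symbolically:

```python
import sympy as sp

E, p = sp.symbols('E p', positive=True)

# Fixed-point equation: one toss is always spent; with probability (1 - p)
# the count starts over, contributing another E tosses in expectation.
solution = sp.solve(sp.Eq(E, 1 + (1 - p) * E), E)
print(solution)  # [1/p]
```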

All this gives $E[X] = \dfrac1p + \dfrac1p + \dots + \dfrac1p = \dfrac{k}p$.
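The closed form $\frac{k}{p}$ can also be checked by simulation. Below is a minimal Monte Carlo sketch in Python; the parameter values $p = 0.3$ and $k = 5$ are illustrative choices of mine, not from the question:

```python
import random

def tosses_until_k_heads(p, k):
    """Toss a coin that lands heads with probability p until k heads appear;
    return the total number of tosses."""
    tosses, heads = 0, 0
    while heads < k:
        tosses += 1
        if random.random() < p:
            heads += 1
    return tosses

# Illustrative parameters (my own choice): p = 0.3, k = 5.
p, k, trials = 0.3, 5, 100_000
estimate = sum(tosses_until_k_heads(p, k) for _ in range(trials)) / trials
print(f"simulated: {estimate:.3f}, predicted k/p: {k / p:.3f}")
```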


The question has been answered well, but for the record I'm adding an alternative solution that conditions on the first toss. Let $E[N_k]$ be the expected number of tosses needed to obtain $k$ heads, and let the random variable $X_1$ denote the outcome of the first toss.

$$E[N_k] = E[N_k \mid X_1=H]\,P(X_1=H) + E[N_k \mid X_1=T]\,P(X_1=T)$$

When we know that the first toss came up $T$, we still need $k$ heads but have already spent one toss, so $$E[N_k \mid X_1=T] = 1 + E[N_k]$$

And when the first toss comes up $H$, we need only $k-1$ more heads, and therefore $$E[N_k \mid X_1=H] = 1 + E[N_{k-1}]$$

Substituting these into the first equation gives

$$E[N_k] = (1+E[N_{k-1}])\,p + (1+E[N_k])(1-p),$$ which simplifies to $$E[N_k] = E[N_{k-1}] + \frac{1}{p}$$

Since $E[N_0] = 0$, unrolling this recurrence gives $$E[N_k] = \frac{k}{p}$$
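For completeness, here is a tiny Python sketch (with illustrative values of $p$ and $k$ of my own choosing) that unrolls the simplified recurrence and compares the result with the closed form $k/p$:

```python
def expected_tosses(p, k):
    """Unroll E[N_j] = E[N_{j-1}] + 1/p starting from E[N_0] = 0."""
    e = 0.0
    for _ in range(k):
        e += 1.0 / p
    return e

# Illustrative values (my own choice): p = 0.3, k = 5.
print(expected_tosses(0.3, 5), 5 / 0.3)  # both give roughly 16.666...
```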
