PDF of a sum of exponential random variables

Let's start by observing that the conditional random variable $Y\mid N$ follows a Gamma distribution with shape parameter $N$ and rate $\lambda$, so that $\mathsf{E}(Y\mid N) = N\, \mathsf{E}(X) = N/\lambda$.

Then, for $y \gt 0$

$$ f_Y(y) = \mathsf{E}\left(f_{Y\mid N}(y)\right) = \mathsf{E}\left( \frac{\lambda^N y^{N-1}}{(N-1)!} \mathrm{e}^{-\lambda y} \right) = \sum_{n=1}^\infty \frac{\lambda^n y^{n-1}}{(n-1)!} \mathrm{e}^{-\lambda y} (1-p)^{n-1} p, $$ using $\mathsf{P}(N=n) = (1-p)^{n-1} p$. The sum collapses via the exponential series $\sum_{n\geq 1} \frac{(\lambda (1-p) y)^{n-1}}{(n-1)!} = \mathrm{e}^{\lambda (1-p) y}$, so $$ f_Y(y) = \lambda p\, \mathrm{e}^{-\lambda y}\, \mathrm{e}^{\lambda (1-p) y} = \lambda p\, \mathrm{e}^{-\lambda p y}. $$ Hence $Y$ is also an exponential random variable, with rate $\lambda p$.
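If you want a quick numerical sanity check, here is a small Monte Carlo sketch in Python/NumPy (the values `lam = 2.0` and `p = 0.3` are arbitrary illustrative choices, not from the question): draw $N$ from the geometric distribution, draw $Y\mid N$ from the corresponding Gamma distribution, and compare the empirical mean with $1/(\lambda p)$.

```python
import numpy as np

rng = np.random.default_rng(0)
lam, p, n_samples = 2.0, 0.3, 100_000   # assumed illustrative parameters

# N takes values 1, 2, 3, ... with P(N = n) = (1 - p)**(n - 1) * p
N = rng.geometric(p, size=n_samples)

# Conditionally on N, Y is Gamma(shape=N, rate=lam), i.e. the sum of N Exponential(lam) draws
Y = rng.gamma(shape=N, scale=1.0 / lam)

print("empirical mean  :", Y.mean())         # should be close to 1 / (lam * p)
print("theoretical mean:", 1.0 / (lam * p))
```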

Another way of seeing this is via the characteristic function: $$ \phi_Y(t) = \mathsf{E}\left(\exp\left(i t Y\right)\right) = \mathsf{E}\left(\mathsf{E}\left(\exp\left(i t Y\right)\mid N\right)\right) = \mathsf{E}\left( \left(\phi_{X}(t)\right)^N \right) = \frac{p\, \phi_X(t)}{1-(1-p) \phi_X(t)}, $$ where the last equality is the probability generating function of $N$ evaluated at $\phi_X(t)$. Substituting $\phi_X(t) = \frac{\lambda}{\lambda - i t}$ and rearranging terms we get $$ \phi_Y(t) = \frac{\lambda p}{\lambda p - i t}, $$ confirming that $Y$ is exponentially distributed with parameter $\lambda p$.
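If you prefer to let a computer do the rearranging, the last step can be checked symbolically, for instance with SymPy (a sketch; the symbol names are arbitrary):

```python
import sympy as sp

lam, p, t = sp.symbols("lambda p t", positive=True)

phi_X = lam / (lam - sp.I * t)              # characteristic function of Exponential(lam)
phi_Y = p * phi_X / (1 - (1 - p) * phi_X)   # E[(phi_X(t))**N] for N ~ Geometric(p) on {1, 2, ...}
target = lam * p / (lam * p - sp.I * t)     # characteristic function of Exponential(lam * p)

# Expected to print 0, i.e. the two expressions agree
print(sp.simplify(phi_Y - target))
```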


Another way to understand Sasha's answer:

Consider a Poisson process with rate $\lambda$. It is known that the waiting time between events is distributed as Exponential$(\lambda)$. Now consider a new process in which you keep each event independently with probability $p$ (and remove it with probability $1-p$). This thinned process is again a Poisson process, with rate $p \lambda$, hence its waiting times are Exponential($p\lambda$). Observe that the waiting time between events in the new process follows exactly the distribution you are looking for (where the $X_{i}$ are the waiting times in the original process).
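Here is a small simulation sketch of this thinning argument (again with arbitrary illustrative values `lam = 2.0`, `p = 0.3`): simulate the original process, keep each event with probability $p$, and check that the gaps between kept events have mean $1/(p\lambda)$.

```python
import numpy as np

rng = np.random.default_rng(1)
lam, p, n_events = 2.0, 0.3, 200_000   # assumed illustrative parameters

# Arrival times of the original rate-lam Poisson process
arrival_times = np.cumsum(rng.exponential(1.0 / lam, size=n_events))

# Keep each event independently with probability p (thinning)
kept = arrival_times[rng.random(n_events) < p]

# Waiting times between kept events
gaps = np.diff(kept)

print("empirical mean gap:", gaps.mean())      # should be close to 1 / (p * lam)
print("theoretical mean  :", 1.0 / (p * lam))
```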
