Independence of Poisson random variables coming from Poisson sampling

OK, let's start by stating the result you are looking for (according to my interpretation):

Let $\lambda > 0$, $m \sim \text{Pois}(\lambda)$, and let $p_i > 0$ for $i \in \{1,\dotsc,n\}$ with $\sum_{i=1}^n p_i = 1$. Furthermore, let $X = (X_1, \dotsc, X_n) \big\vert m \sim \text{Multinomial}(m, (p_1,\dotsc,p_n))$. It then follows that $X_i \sim \text{Pois}(p_i\lambda)$ and that the $X_i$ are mutually independent.
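
This is not needed for the proof, but as a quick sanity check, here is a minimal simulation sketch (NumPy; the values of $\lambda$, the $p_i$, and the number of replications are arbitrary choices of mine). It compares the empirical means of the $X_i$ with $p_i\lambda$ and prints the empirical covariance matrix, whose off-diagonal entries should be close to $0$ under the claimed independence (uncorrelatedness being, of course, only a necessary condition):

```python
import numpy as np

rng = np.random.default_rng(0)
lam = 5.0                       # arbitrary rate lambda
p = np.array([0.2, 0.3, 0.5])   # arbitrary p_i summing to 1
reps = 100_000

# Draw m ~ Pois(lambda), then split it multinomially given m
m = rng.poisson(lam, size=reps)
X = np.array([rng.multinomial(mi, p) for mi in m])

print("empirical means:", X.mean(axis=0))   # should be close to p_i * lambda
print("p_i * lambda:   ", p * lam)
print("empirical covariance matrix:\n", np.cov(X, rowvar=False))
```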

I will show the result for $n=2$; it is straightforward to generalize this to $n \geq 2$. In the case $n=2$ the result can be stated more simply, namely:

$$ m \sim \text{Pois}(\lambda)$$

and conditional on $m$: $$ \begin{align} X_1 &\sim \text{Binomial}(m, p_1) \\ X_2 = m - X_1 &\sim \text{Binomial}(m, 1-p_1) \end{align} $$

Now let $k,l \in \mathbb N_0$ be non-negative integers. Since you already derived the marginal distributions, it follows that:

$$\Pr\left[X_1=k\right] = \frac{e^{-p_1\lambda}(p_1\lambda)^k}{k!} \;,\; \Pr\left[X_2=l\right] = \frac{e^{-(1-p_1)\lambda}((1-p_1)\lambda)^l}{l!}$$
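
(For completeness, in case someone reads this without having derived the marginals: one way to get them is the law of total probability, summing over $m$, e.g. for $X_1$:

$$\Pr\left[X_1=k\right] = \sum_{m=k}^{\infty}\binom{m}{k}p_1^k(1-p_1)^{m-k}\,\frac{e^{-\lambda}\lambda^{m}}{m!} = \frac{(p_1\lambda)^k e^{-\lambda}}{k!}\sum_{j=0}^{\infty}\frac{\left((1-p_1)\lambda\right)^{j}}{j!} = \frac{e^{-p_1\lambda}(p_1\lambda)^k}{k!}$$

and analogously for $X_2$.)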

It also holds that:

$$ \begin{align} \Pr\left[X_1=k, X_2=l\right] &= \Pr\left[X_1=k, m=k+l\right] \\ &= \Pr\left[X_1=k \; \vert \; m=k+l\right]\Pr\left[m=k+l\right] \\ &= \binom{k+l}{k}p_1^k(1-p_1)^l \frac{e^{-\lambda}\lambda^{k+l}}{(k+l)!} \\ &= \Pr\left[X_1=k\right]\Pr\left[X_2=l\right] \end{align} $$

From the second to the third line I used the fact that $m \sim \text{Pois}(\lambda)$ and that $X_1 \big\vert m = k+l \sim \text{Binomial}(k+l, p_1)$.
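
The last equality is then just a regrouping of factors, using $\binom{k+l}{k}\frac{1}{(k+l)!} = \frac{1}{k!\,l!}$ and $e^{-\lambda} = e^{-p_1\lambda}e^{-(1-p_1)\lambda}$:

$$\binom{k+l}{k}p_1^k(1-p_1)^l\,\frac{e^{-\lambda}\lambda^{k+l}}{(k+l)!} = \frac{e^{-p_1\lambda}(p_1\lambda)^k}{k!}\cdot\frac{e^{-(1-p_1)\lambda}\left((1-p_1)\lambda\right)^l}{l!} = \Pr\left[X_1=k\right]\Pr\left[X_2=l\right]$$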

This proves the independence of $X_1$ and $X_2$.


It appears that what you have in mind is that $x_1+\cdots+x_n=1$ and $x_1,\ldots,x_n$ are all non-negative. You use some words in slightly non-standard ways; in particular you say $x$ is a distribution. I'd have said $x$ is a parameter. The number of observations is $M\sim\mathrm{Poisson}(m)$ and each observation is in the $i$th category with probability $x_i$. If you let $M_i$ be the number in the $i$th category, then $M_i\sim\mathrm{Poisson}(mx_i)$.

If I understand correctly, your question is how to show that $M_1,\ldots,M_n$ are independent. You have $M_1+\cdots+M_n=M$.

Writing $k = k_1+\cdots+k_n$, and using $x_1+\cdots+x_n = 1$ (so that $e^{-m} = e^{-mx_1-\cdots-mx_n}$):

\begin{align} & \Pr(M_1=k_1\ \&\ \cdots\ \&\ M_n=k_n) \\[6pt] = {} & \Pr(M=k)\cdot\Pr(M_1=k_1\ \&\ \cdots\ \&\ M_n=k_n\mid M=k) \\[6pt] = {} & \frac{m^k e^{-m}}{k!} \cdot \frac{k!}{k_1!\cdots k_n!} x_1^{k_1}\cdots x_n^{k_n} \\[6pt] = {} & m^{k_1+\cdots+k_n} e^{-mx_1-\cdots-mx_n} \frac 1 {k_1!\cdots k_n!} x_1^{k_1}\cdots x_n^{k_n} \\[6pt] = {} & \frac{(mx_1)^{k_1} e^{-mx_1}}{k_1!} \cdots \frac{(mx_n)^{k_n} e^{-mx_n}}{k_n!} \\[6pt] = {} & \Pr(M_1=k_1)\cdots\Pr(M_n=k_n). \end{align}
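
Not part of the argument, but if a numerical confirmation is helpful, here is a small sketch (SciPy; the values of $m$, the $x_i$, and the $k_i$ are arbitrary choices of mine) checking that the first and last expressions in the chain above agree:

```python
import numpy as np
from scipy.stats import poisson, multinomial

m = 4.2                            # arbitrary Poisson rate
x = np.array([0.25, 0.25, 0.5])    # arbitrary x_i summing to 1
ks = np.array([2, 0, 5])           # arbitrary counts k_1, ..., k_n
k = int(ks.sum())

# Pr(M = k) * Pr(M_1 = k_1, ..., M_n = k_n | M = k)
lhs = poisson.pmf(k, m) * multinomial.pmf(ks, n=k, p=x)

# Pr(M_1 = k_1) * ... * Pr(M_n = k_n) with M_i ~ Poisson(m * x_i)
rhs = np.prod(poisson.pmf(ks, m * x))

print(lhs, rhs)        # the two numbers should coincide (up to rounding)
```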