Prove that if $A \subset B$ then $P(A) \leq P(B)$

You probably learned a fact along the lines of "if two events $X$ and $Y$ are disjoint, then $P(X\cup Y)=P(X)+P(Y)$" (finite additivity; note that independence is not needed here). Since $A$ and $A^c\cap B$ are disjoint, you have \begin{align*} P(B)&=P(A\cup(A^c\cap B))\\&=P(A)+P(A^c\cap B)\\&\geq P(A)+0\\&=P(A) \end{align*} where we used the fact that $P(A^c\cap B)\geq 0$.


Alternatively, write $B = A \cup (B\setminus A)$.

Note: $A$ and $B\setminus A$ are disjoint, so additivity gives

$P(B) = P(A) + P(B\setminus A) \geq P(A)$,

since $P(B\setminus A) \geq 0$.

Therefore, $P(A) \leq P(B)$.
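As a quick sanity check of the disjoint decomposition and the resulting inequality, here is a small numeric example on a finite uniform sample space (a fair six-sided die; the specific events $A$ and $B$ are my own illustration, not from the answers above):

```python
from fractions import Fraction

# Uniform probability measure on a fair six-sided die
# (hypothetical example for illustration).
omega = {1, 2, 3, 4, 5, 6}

def P(event):
    """Probability of an event under the uniform measure on omega."""
    return Fraction(len(event & omega), len(omega))

A = {1, 2}          # A is a subset of B
B = {1, 2, 3, 4}

# The decomposition B = A ∪ (B \ A) into disjoint pieces:
assert B == A | (B - A)
assert A & (B - A) == set()

# Additivity over the disjoint union, and the resulting monotonicity:
assert P(B) == P(A) + P(B - A)
assert P(A) <= P(B)

print(P(A), P(B))  # prints: 1/3 2/3
```

Of course this checks only one finite example; the proofs above show the inequality holds for any probability measure.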