Why does a CDF need to be right-continuous?

Well, in a finite measure space (by which I mean the measure is $\sigma$-additive and finite), if $\{A_i\}_{i\in\Bbb N}$ is a sequence of measurable sets such that $A_i\supseteq A_{i+1}$ for all $i$, then $\mu\left(\bigcap_{i\in\Bbb N}A_i\right)=\inf_{i\in\Bbb N} \mu(A_i)=\lim_{i\to\infty} \mu(A_i)$. In your special case, where each $A_i$ is a hyperrectangle of the form $R\left(a^{(i)}\right)=\left(-\infty,a^{(i)}_1\right]\times\cdots\times\left(-\infty, a^{(i)}_k\right]$ and $\mu=\Bbb P_X$, this translates to $$\mathbb P_X\left(R(a)\right)=\mathbb P_X\left(\bigcap_{i\in\Bbb N} R\left(a^{(i)}\right)\right)=\lim_{i\to\infty} \mathbb P_X\left(R\left(a^{(i)}\right)\right)$$ for every sequence $a^{(i)}\searrow a$ (note that $\bigcap_{i\in\Bbb N} R\left(a^{(i)}\right)=R(a)$ precisely because the intervals are closed on the right). This is exactly right-continuity of the CDF.
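If a numerical illustration helps, here is a minimal sketch (the choice of scipy and of a Bernoulli$(1/2)$ variable, which has an atom at $0$, is just my own example): approaching the jump from the right reproduces the CDF value at the jump, while approaching from the left does not.

```python
# Minimal numerical sketch of right-continuity at a jump of the CDF.
# X ~ Bernoulli(1/2) has an atom at 0, so F has a jump there:
# F(0 + eps) -> F(0) = 1/2 as eps -> 0+, but F(0 - eps) -> 0 != F(0).
from scipy.stats import bernoulli

X = bernoulli(0.5)
a = 0.0
for eps in (0.1, 0.01, 0.001, 0.0001):
    print(f"F({a}+{eps}) = {X.cdf(a + eps):.4f},  F({a}-{eps}) = {X.cdf(a - eps):.4f}")
print(f"F({a}) = {X.cdf(a):.4f}")  # the right-hand limit matches F(a); the left-hand limit does not
```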


This can be proven from the "continuity of probability" result for events that shrink to a limiting event: $$A_n\searrow A \implies P[A_n]\rightarrow P[A]$$ (and this is derived from the countable additivity axiom).
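In case a sketch of that derivation is useful: since the $A_n$ decrease to $A$, the set $A_1$ is the disjoint union of $A$ and the "rings" $A_n\setminus A_{n+1}$, so countable additivity and telescoping give $$P[A_1]=P[A]+\sum_{n=1}^{\infty}\big(P[A_n]-P[A_{n+1}]\big)=P[A]+P[A_1]-\lim_{n\to\infty}P[A_n]$$ (the limit exists because $P[A_n]$ is nonincreasing and bounded below), and cancelling the finite quantity $P[A_1]$ yields $P[A_n]\rightarrow P[A]$.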


One reason this is important is that it helps students to be precise when they draw pictures of CDF functions. They need to learn to be detail-oriented enough to respect this issue when points of discontinuity arise.

Another reason it is important is that it relates to the following question:

Question: "What functions $F:\mathbb{R}\rightarrow\mathbb{R}$ are valid CDF functions?"

Answer: A function $F(x)$ is a valid CDF, meaning that there exists a random variable $X$ for which $P[X\leq x] = F(x)$ for all $x \in \mathbb{R}$, if and only if these four criteria are satisfied:

  • $F(x)$ is nondecreasing.
  • $\lim_{x\rightarrow-\infty} F(x) = 0$.
  • $\lim_{x\rightarrow\infty} F(x)=1$.
  • $F(x)$ is right-continuous.

So the right-continuous property has a place of prominence in this fundamental question.
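For concreteness, here is a rough sketch of the "if" direction of that answer (the function names, the bisection tolerance, and the Exponential example are my own illustrative choices): given a valid CDF $F$, applying the generalized inverse $F^{-}(u)=\inf\{x: F(x)\geq u\}$ to a Uniform$(0,1)$ variable produces a random variable whose CDF is $F$. Right-continuity of $F$ is exactly what makes the set $\{x: F(x)\geq u\}$ a closed half-line, so that the construction works.

```python
# Sketch: inverse-transform construction of a random variable with a given CDF F.
# generalized_inverse() is a crude numerical stand-in for inf{x : F(x) >= u},
# assuming the relevant x lies in [lo, hi].
import numpy as np

def generalized_inverse(F, u, lo=-1e6, hi=1e6, tol=1e-9):
    """Bisection for inf{x : F(x) >= u}; relies on F being nondecreasing."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if F(mid) >= u:
            hi = mid
        else:
            lo = mid
    return hi

def F(x):
    """Example choice: CDF of an Exponential(1) random variable."""
    return 1.0 - np.exp(-x) if x > 0 else 0.0

rng = np.random.default_rng(0)
samples = np.array([generalized_inverse(F, u) for u in rng.uniform(size=5000)])
# Empirical check: P[X <= 1] should be close to F(1) = 1 - e^{-1} ≈ 0.632
print(np.mean(samples <= 1.0), F(1.0))
```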


This fact is useful for resolving the following natural question: Let $\{X_i\}_{i=1}^{\infty}$ be i.i.d. random variables uniform over $[-1,1]$. Define $$ L_n = \frac{1}{n}\sum_{i=1}^n X_i \quad \forall n \in \{1, 2, 3, ...\}$$ Does there exist a random variable $Y$ for which $P[L_n\leq x]\rightarrow P[Y\leq x]$ for every $x\in\mathbb{R}$? The answer is "no" because $$ \lim_{n\rightarrow\infty} P[L_n\leq x] = \left\{\begin{array}{ll} 1 & \mbox{ if $x >0$}\\ 1/2 & \mbox{ if $x=0$}\\ 0 & \mbox{ if $x<0$} \end{array}\right.$$ and, because this limiting function is not right-continuous, it is not a valid CDF for any random variable.
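A quick Monte Carlo check of that limit (my own sketch; the sample sizes and the test points $x=\pm 0.05$ are arbitrary):

```python
# Monte Carlo sketch: for large n, P[L_n <= x] is near 0 for x < 0,
# near 1/2 at x = 0, and near 1 for x > 0.
import numpy as np

rng = np.random.default_rng(1)
n, trials = 10_000, 5_000
L_n = np.array([rng.uniform(-1.0, 1.0, size=n).mean() for _ in range(trials)])

for x in (-0.05, 0.0, 0.05):
    print(f"estimate of P[L_n <= {x:+.2f}]: {np.mean(L_n <= x):.3f}")
```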

Of course, the CDF of the always-zero random variable $0$ is the right-continuous unit step function, which differs from the above function only at the point of discontinuity at $x=0$. Such issues are the reason why the definition of "$Y_n\rightarrow Y$ in distribution" has the caveat that the convergence $P[Y_n\leq y] \rightarrow P[Y\leq y]$ only needs to take place at points $y$ where $P[Y\leq y]$ is continuous. With this caveat in mind, it is correct to say that $L_n\rightarrow 0$ in distribution (and of course we also know $L_n\rightarrow 0$ with probability 1 by the law of large numbers).


It doesn't "have" to be. A distribution function is defined either as $$F_X(x)=\mathbb{P}_X((-\infty,x])=\mathbb{P}(X\leq x)$$

Then it is right continuous (follows from continuity of measures from above). It could be defined as $$F_X(x)=\mathbb{P}_X((-\infty,x))=\mathbb{P}(X<x)=1-\mathbb{P}(X\geq x)$$ Then it is left continuous, which again follows from continuity of measures.
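To see the difference between the two conventions concretely, here is a small sketch (the Binomial$(2,1/2)$ example, which has an atom at $1$, is just an arbitrary choice): the two definitions disagree exactly at the atoms of the distribution.

```python
# At an atom of the distribution the two conventions disagree:
# the right-continuous CDF takes the value P[X <= x] there,
# the left-continuous one takes P[X < x].
from scipy.stats import binom

X = binom(2, 0.5)             # atoms at 0, 1, 2 with probabilities 1/4, 1/2, 1/4
x = 1
F_right = X.cdf(x)            # P[X <= 1] = 0.75
F_left = X.cdf(x) - X.pmf(x)  # P[X <  1] = 0.25
print(F_right, F_left)
```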