Why does the monotone convergence theorem not apply to Riemann integrals?

Riemann integrable functions (on a compact interval) are also Lebesgue integrable, and the two integrals coincide. So the theorem is valid for Riemann integrals as well, provided the limit function is itself Riemann integrable.

However, the pointwise increasing limit of a sequence of Riemann integrable functions need not be Riemann integrable. Let $(r_n)$ be an enumeration of the rationals in $[0,1]$, and let $f_n$ be as follows:

$$f_n(x) = \begin{cases} 1 & \text{if $x \in \{ r_0, r_1, \dots, r_{n-1} \}$} \\ 0 & \text{if $x \in \{ r_n, r_{n+1}, \dots \}$} \\ 0 & \text{if $x$ is irrational} \\ \end{cases}$$

Then each $f_n$ is Riemann integrable with $\int_0^1 f_n = 0$ (it vanishes outside a finite set), but the limit function is the indicator of the rationals in $[0,1]$ (the Dirichlet function), which is nowhere continuous, hence not Riemann integrable.
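
A small Python sketch can make this contrast concrete. It uses one particular enumeration of the rationals in $[0,1]$ (the helper names `rationals_in_unit_interval` and `upper_darboux_sum_fn` are my own, not from the answer) and shows that the upper Darboux sums of each $f_n$ shrink with the mesh, so $\int_0^1 f_n = 0$; for the limit function, every subinterval of any partition contains both rationals and irrationals, so every upper sum is $1$ and every lower sum is $0$.

```python
from fractions import Fraction

def rationals_in_unit_interval():
    """One concrete enumeration of the rationals in [0,1]: 0, 1, 1/2, 1/3, 2/3, 1/4, 3/4, ..."""
    yield Fraction(0)
    yield Fraction(1)
    q = 2
    while True:
        for p in range(1, q):
            if Fraction(p, q).denominator == q:  # skip non-reduced fractions such as 2/4
                yield Fraction(p, q)
        q += 1

def upper_darboux_sum_fn(n, num_subintervals):
    """Upper Darboux sum of f_n (indicator of the first n rationals) on the uniform
    partition of [0,1].  The lower sum is always 0, since every subinterval
    contains irrational points."""
    gen = rationals_in_unit_interval()
    points = [next(gen) for _ in range(n)]
    h = Fraction(1, num_subintervals)
    total = Fraction(0)
    for k in range(num_subintervals):
        left, right = k * h, (k + 1) * h
        # sup of f_n on this subinterval is 1 iff some r_i lies in it
        if any(left <= r <= right for r in points):
            total += h
    return total

# Each f_n is Riemann integrable with integral 0: its upper sums shrink to 0.
for m in (10, 100, 1000, 10000):
    print(m, float(upper_darboux_sum_fn(5, m)))

# For the limit f (indicator of Q ∩ [0,1]) no refinement helps: every upper
# Darboux sum equals 1 and every lower sum equals 0, so f is not Riemann integrable.
```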


Here is a version of the monotone convergence theorem for Riemann integrals that can be proved without reference to measure theory:

Theorem. Let $\{f_n\}$ be a nondecreasing sequence of Riemann integrable functions on $[a,b]$ converging pointwise to a Riemann integrable function $f$ on $[a,b]$. Then $$ \lim_{n\to \infty}\int_a^b f_n(x)\,dx=\int_a^b f(x)\,dx. $$
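
As an illustration (not part of the cited proof), here is a minimal Python sketch checking the theorem numerically on the nondecreasing sequence $f_n(x) = 1 - x^n$ on $[0,1]$, whose pointwise limit equals $1$ on $[0,1)$ and $0$ at $x = 1$; this limit is Riemann integrable, so the theorem applies, and indeed $\int_0^1 f_n = 1 - \tfrac{1}{n+1} \to 1 = \int_0^1 f$. The helper `riemann_sum` is my own naming.

```python
import numpy as np

def riemann_sum(g, a, b, m=100_000):
    """Midpoint Riemann sum of g over [a, b] with m uniform subintervals."""
    x = a + (np.arange(m) + 0.5) * (b - a) / m
    return float(np.sum(g(x)) * (b - a) / m)

# Nondecreasing in n for each fixed x in [0,1]: f_n(x) = 1 - x**n.
# Pointwise limit: f(x) = 1 for x in [0,1), f(1) = 0 -- Riemann integrable
# (a single discontinuity), so the hypotheses of the theorem hold.
f_limit = lambda x: np.where(x < 1.0, 1.0, 0.0)

for n in (1, 2, 5, 10, 50, 200):
    f_n = lambda x, n=n: 1.0 - x**n
    print(n, riemann_sum(f_n, 0.0, 1.0))         # exact value: 1 - 1/(n+1)

print("limit:", riemann_sum(f_limit, 0.0, 1.0))  # exact value: 1
```

Note that the sequence $f_n$ from the first answer does not satisfy the hypotheses of this theorem, since its pointwise limit fails to be Riemann integrable.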

An elementary proof is given in this paper.