How to learn a continuous function?

The answer is no by a Cantor diagonal argument:

Let $\Omega=(0,1)$.

Let $G$ be the set of all functions that can be computed by a finite number of registers with finite precision. It does not matter how the functions in $G$ are learned.

  • The number of states of $n$ registers with precision $m$ is finite, so the number of functions computable on $n$ registers with precision $m$ is finite. Let the set of such functions be $G_{mn}$.

  • Thus, $G=\bigcup_{m\in \mathbb{N}} \bigcup_{n\in \mathbb{N}} G_{mn}$ is a countable union of finite sets, hence countable. Label the elements of $G$ as $G_1,G_2,\dots$ (a counting sketch follows this list).
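
As a sanity check on the counting step, here is a minimal Python sketch. It assumes each register holds one of $2^m$ values and counts a "computable function" as a map from machine states to machine states; both modeling choices are illustrative, not part of the argument above.

```python
# Sketch of the counting bound; the state-space model is an assumption.
def num_functions(m: int, n: int) -> int:
    states = (2 ** m) ** n    # n registers, each holding one of 2**m values
    return states ** states   # all maps from states to states: finite

# |G_{mn}| is finite for every m, n, so G is a countable union of finite sets.
print(num_functions(2, 1))   # 256 possible functions on one 2-bit register
```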

Since there are infinitely many disjoint intervals contained in $\Omega$, it is possible to stay away from each $G_i$ on an interval of its own.

Let $H_n=\left[1-10^{-n}+\frac{1}{3}10^{-n},\ 1-10^{-n}+\frac{2}{3}10^{-n}\right]$, a closed interval in $\Omega$. Since $G_n$ is measurable, Lusin's theorem gives a continuous function $f_n$ that agrees with $G_n+1$ on at least half of $H_n$ (i.e. the measure of $\{f_n=G_n+1\}$ is at least half that of $H_n$).
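
The $H_n$ are pairwise disjoint and accumulate at $1$, which is what makes the next step possible. A quick numerical check of disjointness, using exact rational arithmetic:

```python
from fractions import Fraction

# Check: the closed intervals H_n are pairwise disjoint and march toward 1.
def H(n: int):
    base = 1 - Fraction(10) ** -n
    return (base + Fraction(1, 3) * Fraction(10) ** -n,
            base + Fraction(2, 3) * Fraction(10) ** -n)

intervals = [H(n) for n in range(1, 6)]
# Each H_n ends strictly before H_{n+1} begins, so they are disjoint.
assert all(a[1] < b[0] for a, b in zip(intervals, intervals[1:]))
print([(float(a), float(b)) for a, b in intervals])
```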

Let $f$ be a continuous function on $\Omega$ that agrees with $f_n$ on $H_n$ for every $n$; such an $f$ exists because the $H_n$ are pairwise disjoint closed intervals accumulating only at $1\notin\Omega$, so one can interpolate continuously between them.

Then, for each $n$, $\|f-G_n\|_{L^\infty(\Omega)} \geq \|f-G_n\|_{L^\infty(H_n)} \geq \|f_n-G_n\|_{L^\infty(\{f_n=G_n+1\})}=1$, so no $G_n$ approximates $f$ to within $1$ in the uniform norm: the continuous function $f$ cannot be learned by any finite-precision machine.


The answer is no by general no-free-lunch principles. In particular, the class of all continuous functions has infinite fat-shattering dimension at every scale, and hence is not learnable in your sense. See Alon, Ben-David, Cesa-Bianchi, and Haussler, "Scale-sensitive dimensions, uniform convergence, and learnability", J. ACM 44(4), 1997. A sketch of the shattering construction follows.
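
To see why the fat-shattering dimension is infinite: for any finite point set, piecewise-linear interpolation produces a continuous function realizing every sign pattern at scale $1$ around level $0$. A minimal sketch (the point set is an arbitrary illustrative choice):

```python
import numpy as np

# Sketch: for every +-1 labeling of a finite point set there is a
# continuous function taking exactly those values (piecewise-linear
# interpolation), witnessing fat-shattering at scale 1 around level 0.
def shattering_function(points, signs):
    points = np.asarray(points, dtype=float)
    signs = np.asarray(signs, dtype=float)
    return lambda x: np.interp(x, points, signs)

points = [0.1, 0.3, 0.5, 0.7, 0.9]           # arbitrary illustrative set
for pattern in range(2 ** len(points)):       # all 32 sign patterns
    signs = [1.0 if (pattern >> i) & 1 else -1.0 for i in range(len(points))]
    f = shattering_function(points, signs)
    assert np.allclose([f(p) for p in points], signs)
print("every sign pattern is realized by a continuous function")
```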


Counterexample: $\sin(1/x)$ on $(0,1)$. Learning the function near $0$ requires infinitely many samples, because it oscillates infinitely often in every neighborhood of $0$.
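
A quick numerical illustration, assuming a uniform sampling grid and a piecewise-linear learner (both choices are illustrative; any finite scheme hits the same wall): the sup-norm error stays large no matter how many samples are taken, because $\sin(1/x)$ completes many full oscillations between consecutive samples close to $0$.

```python
import numpy as np

def sup_error(num_samples: int, eps: float = 1e-3) -> float:
    xs = np.linspace(eps, 1.0, num_samples)   # finite uniform sample grid
    ys = np.sin(1.0 / xs)
    grid = np.linspace(eps, 1.0, 200_000)     # dense evaluation grid
    learner = np.interp(grid, xs, ys)         # piecewise-linear fit
    return float(np.max(np.abs(learner - np.sin(1.0 / grid))))

for n in (10, 100, 1_000, 10_000):
    print(n, sup_error(n))    # error stays large (near 2) regardless of n
```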