What is the "fastest" increasing function that's useful in some area of math?

The busy beaver function $BB(n)$ is (informally) the longest amount of time a computer of size $n$ (that is, an $n$-state Turing machine) can run without going into an infinite loop. It increases much more quickly than Ackermann's function; so much so that it can't be computed at all. In fact, it increases more quickly than any function that can be computed.

The busy beaver function shows up in all sorts of examples of non-computability. For example, the question "Are there any examples of non-computable real numbers?" asked for a real number that can't be computed by any process, and among the answers was $$\sum_{i=1}^\infty 2^{-BB(i)}=2^{-1}+2^{-6}+2^{-21}+2^{-107}+\ ... \ \approx 0.515625476837158203125000000000006$$ in which the busy beaver function plays a central role.
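The first four values of the busy beaver (maximum shift) function are known ($BB(1)=1$, $BB(2)=6$, $BB(3)=21$, $BB(4)=107$), so the partial sum above can be checked with exact rational arithmetic. A quick sketch:

```python
from fractions import Fraction

# the four known values of the busy beaver (maximum shift) function
known_bb = [1, 6, 21, 107]

# exact partial sum of 2^{-BB(i)} over the known values
partial = sum(Fraction(1, 2**b) for b in known_bb)
print(float(partial))  # 0.5156254768371582
```

The remaining terms are unknowable in general: computing $BB(i)$ for all $i$ would solve the halting problem, which is exactly why the full sum is a non-computable real.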


You mentioned the fast-growing hierarchy (FGH), but then dropped it, deeming it useful only for the fact that it grows fast. But to a googologist, the fast-growing hierarchy is something more. (Okay, so a googologist studies large numbers, but still!)

One of the most fundamental things about the FGH is that it is both well suited to induction and fast-growing. This makes it a compact way to approximate large quantities and compare them. For example, one can show that

$$\underbrace{f_\omega(2+n)>2+3\uparrow^n3}_{\text{easily proven}}$$

$$\begin{align}f_{\omega+1}(64)&=\underbrace{f_\omega(f_\omega(\dots f_\omega(64)\dots))}_{64}\\&>\underbrace{f_\omega(f_\omega(\dots f_\omega(2+4)\dots))}_{64}\\&>\underbrace{f_\omega(f_\omega(\dots f_\omega(2+3\uparrow^43)\dots))}_{63}\\&>\underbrace{f_\omega(f_\omega(\dots f_\omega(2+3\uparrow^{3\uparrow^43}3)\dots))}_{62}\\&>\dots\\&>2+\text{Graham's number}\end{align}$$

Likewise, one can show that $f_{\omega+1}(63)$ is less than Graham's number. To a googologist this function is great, as it now gives a manageable form of Graham's number to compare to other large numbers.
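Both sides of these comparisons can be made concrete for tiny inputs. A sketch in Python, assuming the standard FGH definitions $f_0(n)=n+1$, $f_{k+1}(n)=f_k^n(n)$ ($n$-fold iteration), and $f_\omega(n)=f_n(n)$, together with the Knuth up-arrow notation used in $3\uparrow^n 3$ (anything beyond the smallest arguments is astronomically large):

```python
def f(k, n):
    """Fast-growing hierarchy at finite index k: f_0(n) = n + 1,
    f_{k+1}(n) = f_k applied n times to n."""
    if k == 0:
        return n + 1
    x = n
    for _ in range(n):
        x = f(k - 1, x)
    return x

def f_omega(n):
    """Diagonalization: f_omega(n) = f_n(n)."""
    return f(n, n)

def arrow(a, n, b):
    """Knuth's up-arrow a ^^...^ b with n arrows; n = 1 is exponentiation."""
    if n == 1:
        return a ** b
    if b == 0:
        return 1
    return arrow(a, n - 1, arrow(a, n, b - 1))

print(f(1, 5))         # 10 -- f_1(n) = 2n
print(f(2, 3))         # 24 -- f_2(n) = n * 2^n
print(f_omega(2))      # 8
print(arrow(3, 1, 3))  # 27
print(arrow(3, 2, 3))  # 7625597484987 -- 3^(3^3)
```

Graham's number itself is the 64th term of $g_1=3\uparrow^4 3$, $g_k=3\uparrow^{g_{k-1}}3$, which is why $f_{\omega+1}(64)$, built from a 64-fold iteration of $f_\omega$, is the natural FGH benchmark for it.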


Another famous appearance of the FGH is its relation to the Goodstein sequence, whose termination, though true, cannot be proven in Peano arithmetic.

The FGH reaches this strength as well: one can prove that any function on the order of $f_{\epsilon_0}(n)$ or higher cannot be proven total (i.e., to always terminate) in Peano arithmetic, where

$$\epsilon_0=\sup\{\omega,\omega^\omega,\omega^{\omega^\omega}\dots\}$$

which is arguably not only simpler and more straightforward than the Goodstein sequence, but also a generalization of it.
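For concreteness: the Goodstein sequence of $m$ repeatedly writes $m$ in hereditary base-$b$ notation, replaces every $b$ by $b+1$, and subtracts one. A minimal sketch of that base-change step (tiny starting values terminate quickly, while already $m=4$ runs for an astronomically long time):

```python
def bump(n, b):
    """Write n in hereditary base-b notation and replace each b by b+1."""
    if n == 0:
        return 0
    e = 0
    while b ** (e + 1) <= n:
        e += 1                       # largest exponent with b^e <= n
    c = n // b ** e                  # leading coefficient
    # bump the leading term (including its exponent) and recurse on the rest
    return c * (b + 1) ** bump(e, b) + bump(n - c * b ** e, b)

def goodstein(m, max_steps=50):
    """First terms of the Goodstein sequence starting at m."""
    seq, b = [m], 2
    while m > 0 and len(seq) <= max_steps:
        m = bump(m, b) - 1
        b += 1
        seq.append(m)
    return seq

print(goodstein(3))  # [3, 3, 3, 2, 1, 0] -- reaches 0, as the theorem promises
```

Goodstein's theorem says every such sequence hits $0$; the proof assigns each term an ordinal below $\epsilon_0$ and uses the well-ordering of the ordinals, which is precisely the strength unavailable to Peano arithmetic.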


The last little point I will make is that the FGH can grow as fast as any computable function. This means that we can approximate things like $\operatorname{TREE}(3)$!

$$\operatorname{TREE}(n)\approx f_{\theta(\Omega^\omega\omega,0)}(n)$$

as explained in this answer.


So, argue it one way or another, the fast-growing hierarchy is indeed useful for more than just growing fast, though perhaps not for much outside the realm of large numbers.


On a different matter: it does not (even remotely) grow as fast as the busy beaver function, but there are results showing that a general family of graph properties can be $\varepsilon$-tested with $\phi(\varepsilon)$ queries to the graph [GGR98], where $\phi(\varepsilon)$ is of the form $$2^{2^{2^{\cdot^{\cdot^{\cdot^2}}}}}$$ where the dots hide a tower of $1/\varepsilon^5$ $2$'s. (See e.g. these lecture notes.)

Furthermore, one can also show that a tower of $2$'s of height at least $\log^2(1/\varepsilon)$ is necessary in the query complexity (see this).
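To get a feel for these bounds, the tower function itself is easy to write down, though only its first few values fit in memory. A sketch comparing the tower *heights* in the upper bound ($1/\varepsilon^5$) and the lower bound ($\log^2(1/\varepsilon)$, taking $\log$ as $\log_2$ here purely for illustration):

```python
import math

def tower(h):
    """Tower of 2's of height h: tower(0) = 1, tower(h) = 2**tower(h-1)."""
    t = 1
    for _ in range(h):
        t = 2 ** t
    return t

print([tower(h) for h in range(5)])  # [1, 2, 4, 16, 65536]

# heights of the towers in the upper and lower query-complexity bounds
for eps in (0.5, 0.1, 0.01):
    upper_height = math.ceil(eps ** -5)        # 1/eps^5
    lower_height = math.log2(1 / eps) ** 2     # log^2(1/eps), base 2 assumed
    print(eps, upper_height, round(lower_height, 2))
```

Even the modest gap between the two heights translates, after exponentiating, into an enormous gap between the actual query counts, which is why pinning down the exact tower height is the interesting open question here.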