Big O Notation reliability?

The concept is built around "when $n$ is large". The resolution to your problem is noticing that how large counts as "large" varies between contexts (sometimes 2 is already large enough). It looks like your context is too small to count as "large" ... and this always needs checking.


This phenomenon (that big O notation is useless for small $n$) is well known, and it is why algorithms are tuned for specific ranges of $n$.

For example, insertion sorting 5 elements in place is faster than running quicksort on them, which is why many quicksort implementations put an `if (length < 5) do insertion sort` guard clause in the recursion.
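As a minimal sketch of that guard clause (the cutoff of 5 and the middle-element pivot are illustrative choices, not taken from any particular library):

```python
CUTOFF = 5  # threshold is illustrative; real libraries tune it empirically

def insertion_sort(a, lo, hi):
    """Sort a[lo..hi] in place; fast on tiny ranges despite O(n^2) worst case."""
    for i in range(lo + 1, hi + 1):
        key = a[i]
        j = i - 1
        while j >= lo and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key

def quicksort(a, lo=0, hi=None):
    """Quicksort that hands small subarrays to insertion sort."""
    if hi is None:
        hi = len(a) - 1
    if hi - lo + 1 < CUTOFF:      # the "guard clause" for small n
        insertion_sort(a, lo, hi)
        return
    pivot = a[(lo + hi) // 2]     # simple middle-element pivot
    i, j = lo, hi
    while i <= j:                 # Hoare-style partition
        while a[i] < pivot:
            i += 1
        while a[j] > pivot:
            j -= 1
        if i <= j:
            a[i], a[j] = a[j], a[i]
            i += 1
            j -= 1
    quicksort(a, lo, j)
    quicksort(a, i, hi)
```

Which cutoff actually pays off depends on the constant factors of the two sorts on the target machine, and that is exactly the kind of information big O throws away.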

In your example this would mean switching algorithms depending on whether $n^{1.001} \lt 100n$, i.e. whether $n \lt 100^{1000}$.
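To see how extreme that crossover is, here is a quick numerical check (purely illustrative, using the hypothetical $n^{1.001}$ and $100n$ costs from your example):

```python
# Compare the two hypothetical costs n^1.001 and 100*n at realistic sizes.
# The crossover point 100^1000 = 10^2000 is far beyond any input that could
# ever be stored, so the asymptotically "worse" algorithm wins every time.
for e in (3, 9, 18):
    n = 10.0 ** e
    print(f"n = 1e{e}:  n^1.001 = {n ** 1.001:.4g}   100*n = {100 * n:.4g}")
```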


This should be a comment, but there isn't enough room.

There are a number of real-life problems where the gap between the theoretically fastest and the practically fastest algorithm is plain to see. The simplest example I know of is matrix multiplication.

Multiplying two $n\times n$ matrices is obviously at least $O(n^2)$, since the product alone has $n^2$ entries, but the algorithm that comes straight from the definition takes $O(n^3)$. For practical purposes, the best algorithm in most cases is the Strassen algorithm, which runs in approximately $O(n^{2.807})$. The asymptotically fastest known algorithms, refinements of the Coppersmith–Winograd algorithm, run in roughly $O(n^{2.3727})$. But in practice, the smallest matrices for which those are actually faster than the Strassen algorithm are so large that they simply can't be multiplied on modern computers, regardless of which algorithm you pick.
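For concreteness, here is a rough sketch of both (assuming numpy and square matrices whose side is a power of two); it is meant only to show the structure, not to be a tuned implementation:

```python
import numpy as np

def naive_matmul(A, B):
    """Straight from the definition: three nested loops, O(n^3)."""
    n = A.shape[0]
    C = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            for k in range(n):
                C[i, j] += A[i, k] * B[k, j]
    return C

def strassen(A, B, cutoff=64):
    """Strassen's recursion: 7 half-size multiplications instead of 8,
    giving ~O(n^2.807). Falls back to the library multiply below `cutoff`
    (the threshold that actually pays off is machine-dependent)."""
    n = A.shape[0]
    if n <= cutoff:
        return A @ B
    h = n // 2
    A11, A12, A21, A22 = A[:h, :h], A[:h, h:], A[h:, :h], A[h:, h:]
    B11, B12, B21, B22 = B[:h, :h], B[:h, h:], B[h:, :h], B[h:, h:]

    M1 = strassen(A11 + A22, B11 + B22, cutoff)
    M2 = strassen(A21 + A22, B11, cutoff)
    M3 = strassen(A11, B12 - B22, cutoff)
    M4 = strassen(A22, B21 - B11, cutoff)
    M5 = strassen(A11 + A12, B22, cutoff)
    M6 = strassen(A21 - A11, B11 + B12, cutoff)
    M7 = strassen(A12 - A22, B21 + B22, cutoff)

    C11 = M1 + M4 - M5 + M7
    C12 = M3 + M5
    C21 = M2 + M4
    C22 = M1 - M2 + M3 + M6
    return np.block([[C11, C12], [C21, C22]])
```

Notice that even this sketch hands small blocks back to the plain multiply: the recursion's extra bookkeeping only wins above some machine-dependent size, which is the theory-versus-practice gap in miniature.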

Incidentally, it's suspected that $O(n^2)$ matrix multiplication is possible (perhaps up to additional subpolynomial factors), but how to achieve it has been an open problem for a long time.

So, if by $O$ being reliable you mean that the theoretically fastest algorithm is also the most useful one on practical data sets, then the answer is no. On the other hand, "practically useful" is much harder to define, which is why $O$ and its variants are how people usually study algorithms.