Why is comparing strings O(n), but comparing numbers O(1)?

Numbers in computers are usually handled in fixed-size units. An int might be 32 or 64 bits depending on the language, compiler, and platform, but within any one of those combinations it is never variable-length.

Therefore, when comparing numbers, you have a fixed number of bits to compare, and it's easy to build a hardware circuit that compares all of those bits at once (i.e. as "one action").
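A minimal illustration in Java (the helper name `sameLong` is just for this example): whether the operands are 42 or `Long.MAX_VALUE`, the comparison is a single fixed-width operation.

```java
// Comparing two fixed-width numbers takes the same time no matter how
// large the values are: a long is always exactly 64 bits.
static boolean sameLong(long a, long b) {
    return a == b; // O(1): one compare over a fixed number of bits
}
```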

Strings, on the other hand, have inherently variable lengths, so just saying "string" doesn't tell you how many bits you'll have to compare.
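As a sketch of what that means in practice, here is roughly how string equality is typically implemented (the helper name `stringEquals` is hypothetical; real implementations such as `String.equals` work along the same lines):

```java
// Compare lengths first, then walk the characters. In the worst case
// every character is visited, hence O(n).
static boolean stringEquals(String a, String b) {
    if (a.length() != b.length()) {
        return false;               // different lengths: can't be equal
    }
    for (int i = 0; i < a.length(); i++) {
        if (a.charAt(i) != b.charAt(i)) {
            return false;           // first mismatch ends the scan early
        }
    }
    return true;                    // all n characters matched
}
```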

There are exceptions, however: variable-length numbers, usually called something like BigInteger or BigDecimal, behave much like strings in this respect. Comparing two BigDecimal values for equality can end up being O(n), where n is the length of their representations, not either of their numeric values.
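To see why, consider that such a number is stored as an array of machine words (OpenJDK's BigInteger, for instance, keeps its magnitude in an int[]). The toy `bigEquals` below is a hypothetical simplification that ignores sign handling, but it shows where the O(n) comes from:

```java
// A toy model of arbitrary-precision equality: the value is stored as
// an array of machine words, so the check has to walk the array and
// the cost grows with the number of words, i.e. O(n).
static boolean bigEquals(int[] magA, int[] magB) {
    if (magA.length != magB.length) {
        return false;               // different word counts: not equal
    }
    for (int i = 0; i < magA.length; i++) {
        if (magA[i] != magB[i]) {
            return false;           // mismatching word found
        }
    }
    return true;
}
```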


Usually programs represent numbers as fixed-size data structures (binary values, which is why you may see their sizes described in bits). Because the size is fixed, comparisons take a constant amount of time, i.e. they are O(1), which is one of the benefits of such a representation. A downside is a limit on the range of values that can be represented.

An alternate representation that lifts this restriction, allowing for an arbitrarily large range of numbers, would thus no longer be fixed in size, and no longer be O(1) to compare.
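A short usage sketch with java.math.BigInteger from the standard library (the class name `BigCompare` and the one-million-bit size are just arbitrary choices for illustration):

```java
import java.math.BigInteger;

public class BigCompare {
    public static void main(String[] args) {
        // The fixed-width range limit is gone, but the representation now
        // grows with the value, so equality examines the whole thing.
        BigInteger a = BigInteger.valueOf(2).pow(1_000_000); // ~a million bits
        BigInteger b = BigInteger.valueOf(2).pow(1_000_000);
        System.out.println(a.equals(b)); // true, after walking the full magnitude
    }
}
```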