Why does `df` use "Bi" as an abbreviation for bytes?

It's the difference between a decimal (SI) prefix and a binary prefix. In this case, it's saying you are using 0 bytes, expressed in binary units.

What's the difference?

Using "Giga" as our example, it means 10003 of something (i.e. Gigahertz).

In computers it poses an interesting problem:

A Gigabyte is 1000³ bytes. However, computers work in binary, so storage sizes are traditionally counted in powers of 2, which gives 1024³ bytes instead. To tell the two apart, we use different notation:

  • Giga is decimal (base 10): 1000³
  • Gibi is binary (base 2): 1024³

The output is telling you that it's using binary units.
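
To see how much the two differ, here is a quick sanity check using plain Bash arithmetic (nothing df-specific, just the 1000³ vs 1024³ math spelled out):

$ echo $(( 1000 ** 3 ))   # bytes in 1 GB (gigabyte, base 10)
1000000000
$ echo $(( 1024 ** 3 ))   # bytes in 1 GiB (gibibyte, base 2), about 7.4% more
1073741824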

If you want the output in human-readable decimal (base 10) notation instead, use a capital "H":

$ df -H
Filesystem      Size   Used  Avail Capacity     iused     ifree %iused  Mounted on
/dev/disk2      1.1T   413G   706G    37% 100935848 172431606   37%   
map auto_home     0B     0B     0B   100%         0         0  100%   /home
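
For comparison, the lowercase -h flag gives the same listing with binary (power-of-2) suffixes, which is where labels like Bi and Gi come from. Both flags are documented in the BSD man page:

$ df -h    # sizes in binary units:  Bi, Ki, Mi, Gi, Ti (powers of 1024)
$ df -H    # sizes in decimal units: B, K, M, G, T (powers of 1000)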

Finally, this is not actually an Apple convention but a BSD one (df is a BSD command). You can find more info on the man page (man df).


Bi means you're in units of 1024⁰ bytes, instead of 1000⁰ bytes.

i.e. both are exactly one byte, so here they're the same unit, but they wouldn't be with larger prefixes whose exponents are non-zero. It looks like df is just being pedantic as a way to stay consistent when in power-of-2 units mode.
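
A quick way to convince yourself of the zero-exponent point, again with plain shell arithmetic (so 0Bi and 0B describe exactly the same quantity):

$ echo $(( 1024 ** 0 )) $(( 1000 ** 0 ))
1 1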

This is a made-up convention: there is no metric or IEC "Bi" unit, only 2-letter IEC prefixes that end with "i", for use with quantities of bits or bytes. (e.g. Mi for mebibytes or mebibits.) And no, you're not expected to ever say that out loud un-ironically with a straight face.

"iB" might make more sense (binary bytes with no prefix), but it's not a thing either.

Tags:

Bash

Storage