Why are network speeds measured in Kbits/sec instead of KBytes/sec?

Most data communication is serial, one bit at a time. There are no bytes on the wire; a byte is a parallel arrangement that exists inside a computer, historically tied to the width of a CPU's ALU. On the wire you may have "octets", but not bytes. So the speed on the wire is measured in bits per second, which is what you see quoted. The stream may be chunked into octets, but that chunking is arbitrary.
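A minimal Python sketch of the arithmetic (the function name is just illustrative; it assumes 8-bit octets and ignores framing and protocol overhead):

    def bits_to_bytes_per_sec(bit_rate_bps):
        # One octet is 8 bits; real-world throughput is lower due to overhead.
        return bit_rate_bps / 8

    print(bits_to_bytes_per_sec(100_000_000))  # 100 Mbit/s -> 12500000.0 bytes/s (~12.5 MB/s)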


I think the distinction is simply because a byte wasn't always 8 bits; on some machines it was 6. The whole concept of a "byte" is arbitrary. Bits, on the other hand, are unambiguous: 8 bits are 8 bits.

In networking, many fields aren't aligned on byte boundaries anyway, so it just doesn't make sense to count in bytes in that context.
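For instance, here is a minimal Python sketch of one such split, using the first octet of an IPv4 header, which packs two 4-bit fields (Version and IHL):

    first_octet = 0x45            # common value: IPv4, 20-byte header
    version = first_octet >> 4    # high 4 bits -> 4
    ihl = first_octet & 0x0F      # low 4 bits  -> 5 (header length in 32-bit words)
    print(version, ihl)           # 4 5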


It's partly tradition. The measurement unit predates computers: back when teleprinters were common, transmission speed was expressed in baud (Bd), the number of symbols transmitted per second.

When Internet access became available to the masses, modems were used for the connection, and in early modems 1 bit/s was equal to 1 Bd (one bit per symbol). During that time "bit" effectively became synonymous with "baud" and it stuck, even in systems where the bit rate isn't the same as the baud rate (for example, modulation or compression can pack more data into fewer symbols, while redundancy can be used to transmit less data with more symbols if the signal is likely to get jammed).
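A small illustrative Python sketch of why the two rates can differ (the figures are textbook modem examples, assuming 1 bit/symbol for an early 300 Bd modem and 4 bits/symbol for a 2400 Bd QAM modem):

    def bit_rate(baud, bits_per_symbol):
        # Bit rate is symbols per second times bits carried per symbol.
        return baud * bits_per_symbol

    print(bit_rate(300, 1))   # 300 bit/s  -- bit rate equals baud rate
    print(bit_rate(2400, 4))  # 9600 bit/s -- bit rate exceeds baud rate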

On the other hand, this theory does not explain why the same convention is used for other networking equipment.