What is the maximum acceptable baud rate error for RS-232 serial communication?

[First off, the "RS" in RS-232 denotes "Recommended Standard", which it was when it was released in the 1960s. Since eons however it is an accepted standard by the EIA as EIA-232, and the old name was officially abandoned since. But it seems there's nothing like killing old habits :-) ]

On topic: the allowed clock tolerance depends on the detection method. IIRC the old 68HC11 sampled a few times at the start of a bit, a few times in the middle, and a few times at the end. (Sampling occurred at 16x the bit rate.) Since the receiver resynchronizes on every start bit, the sample point may drift by up to half a bit by the time the last bit arrives. Most communication on EIA-232 :-) consists of 10 bits (1 start, 8 data, 1 stop), so half a bit by the stop bit works out to 5% total, shared between transmitter and receiver.
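As a back-of-the-envelope check (my own sketch, not from any particular manual): the budget is simply half a bit divided by the frame length, and whatever comes out has to cover transmitter and receiver together.

    /* Half-a-bit-over-a-frame rule of thumb for async serial.
       Illustration only; the frame length and the 50/50 split are assumptions. */
    #include <stdio.h>

    int main(void)
    {
        const double frame_bits   = 10.0;               /* 1 start + 8 data + 1 stop */
        const double total_budget = 0.5 / frame_bits;   /* half a bit over the frame */

        printf("Total TX+RX error budget: %.1f%%\n", total_budget * 100.0);  /* 5.0%  */
        printf("Per side (even split):    %.2f%%\n", total_budget * 50.0);   /* 2.50% */
        return 0;
    }

That is in the same ballpark as the manufacturer figures quoted below.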


Feel free to edit in more values. These are the ones I'm using now (a quick way to check your own clock against figures like these is sketched after the list).

Atmel ATmega UART:

5 bits, normal speed:  ±3.0%
5 bits, double speed:  ±2.5%
10 bits, normal speed: ±1.5%
10 bits, double speed: ±1.0%

NXP LPC111x (Cortex-M0) UART: 1.1% (no table)

Microchip PIC 18F2XK20/4XK20: not defined. (wtf.)
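To check your own setup against these numbers, here is a minimal sketch assuming an AVR-style UART with 16x oversampling in normal speed mode; the clock and baud values are just examples, and only the divider rounding error is considered:

    /* Sketch: how far off the actual baud rate is after UBRR rounding,
       and whether it fits the +/-1.5% (10 bits, normal speed) figure above.
       The formula follows the usual ATmega scheme; the numbers are examples. */
    #include <math.h>
    #include <stdio.h>

    int main(void)
    {
        const double f_cpu  = 8000000.0;   /* 8 MHz system clock (example) */
        const double target = 38400.0;     /* desired baud rate (example)  */

        long   ubrr   = lround(f_cpu / (16.0 * target)) - 1;
        double actual = f_cpu / (16.0 * (ubrr + 1));
        double error  = (actual - target) / target * 100.0;

        printf("UBRR = %ld, actual = %.1f baud, error = %+.2f%%\n",
               ubrr, actual, error);
        printf("Within +/-1.5%% (10 bits, normal speed): %s\n",
               fabs(error) <= 1.5 ? "yes" : "no");
        return 0;
    }

Note that this only accounts for the baud rate divider; oscillator tolerance and jitter (see below) come on top of it and eat into the same budget.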

As the venerable Mr. Horowitz & Hill put it:

By resynchronizing on the START and STOP bits of each character, the receiver doesn't require a highly accurate clock; it only has to be accurate and stable enough for the transmitter and receiver to stay synchronized to within a fraction of a bit period over the time of one character, i.e., an accuracy of a few percent.

Note that conductor length also plays a large part: long cables slow and distort the bit edges, which eats into the same timing budget. I stay under 19200 baud on regular ribbon cable more than 2' long. Makes me jealous of 4' multi-gigabit USB lines.


Bear in mind that the error is peak-to-peak - this can be an issue at high baud rates on a jittery clock. RC oscillators can have quite a lot of jitter, so a clock calibrated to 1% (average or RMS) may have instantaneous errors larger than that. At lower baud rates, where each bit spans many oscillator cycles, this tends to get averaged out.
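A rough way to see that averaging effect (purely an illustration; the ±3% uniform cycle-to-cycle jitter and the cycle counts are made-up numbers):

    /* Sketch: the timing error of one UART bit is the average of the
       periods of all the oscillator cycles that make up that bit, so
       more cycles per bit means cycle-to-cycle jitter averages out. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <math.h>

    /* One nominal-1.0 clock period with +/-3% uniform cycle-to-cycle jitter. */
    static double jittery_period(void)
    {
        return 1.0 + 0.03 * (2.0 * rand() / RAND_MAX - 1.0);
    }

    /* Worst per-bit timing error observed over many bits,
       for a given number of oscillator cycles per bit. */
    static double worst_bit_error(int cycles_per_bit)
    {
        double worst = 0.0;
        for (int bit = 0; bit < 10000; bit++) {
            double len = 0.0;
            for (int c = 0; c < cycles_per_bit; c++)
                len += jittery_period();
            double err = fabs(len / cycles_per_bit - 1.0);
            if (err > worst)
                worst = err;
        }
        return worst;
    }

    int main(void)
    {
        srand(1);
        /* e.g. 16 cycles/bit (high baud rate) vs 1024 cycles/bit (low baud rate) */
        printf("16 cycles/bit:   worst bit error %.2f%%\n", worst_bit_error(16)   * 100);
        printf("1024 cycles/bit: worst bit error %.2f%%\n", worst_bit_error(1024) * 100);
        return 0;
    }

The exact numbers depend on the jitter model, but the per-bit error shrinks sharply as the number of oscillator cycles per bit goes up, which is why low baud rates are more forgiving of a jittery clock.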