I2C: 3.3V and 5V devices without level-shifting on 3.3V-bus?

According to revision 4 of the NXP \$\mathrm{I^2C}\$ spec (UM10204):

"Due to the variety of different technology devices (CMOS, NMOS, bipolar) that can be connected to the I2C-bus, the levels of the logical ‘0’ (LOW) and ‘1’ (HIGH) are not fixed and depend on the associated level of VDD. Input reference levels are set as 30 % and 70 % of VDD; VIL is 0.3VDD and VIH is 0.7VDD. See Figure 38, timing diagram. Some legacy device input levels were fixed at VIL = 1.5 V and VIH = 3.0 V, but all new devices require this 30 %/70 % specification. See Section 6 for electrical specifications." (page 9)

Deeper in the spec, you'll see that this \$ 0.7 \times V_{DD}\$ is the minimum logic high voltage:

*[image: excerpt from the NXP \$\mathrm{I^2C}\$ spec, rev. 4]*

For your 5V system:

\$ 0.7 \times 5 V = 3.5 V\$

\$ 0.3 \times 5 V = 1.5 V\$

To me, the 3.3 V pull-up looks marginal, especially if any of your 5V devices use the 'new' standard of \$ 0.7 \times V_{DD}\$ for logic HIGH.

Your mileage may vary, but it's always best to be within the spec wherever possible...
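As a sanity check, the threshold arithmetic above can be sketched in a few lines of Python (the `i2c_thresholds` helper is just for illustration, not part of any library):

```python
# 30 %/70 % input thresholds from the NXP I2C spec, applied to a 5 V device.

def i2c_thresholds(vdd):
    """Return (VIL_max, VIH_min) per the 30 %/70 % rule."""
    return 0.3 * vdd, 0.7 * vdd

vil_max, vih_min = i2c_thresholds(5.0)
print(f"VIL(max) = {vil_max:.2f} V, VIH(min) = {vih_min:.2f} V")

# A bus pulled up to 3.3 V never rises above 3.3 V, which is below
# VIH(min) = 3.5 V for a 5 V device, so the HIGH level is out of spec.
print("3.3 V pull-up meets VIH(min)?", 3.3 >= vih_min)
```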


Cees's answer is incorrect, in particular the "always" and the "any". Some microcontroller I/Os need a minimum of 0.6 Vdd for a high level, others need 0.7 Vdd, and as Madmanguruman indicates the latter is what the I2C standard specifies. 0.7 Vdd is 3.5 V at a 5 V supply, so 3.3 V is already too low.

But it's even worse. Voltage regulators often have a 5 % tolerance on their nominal output voltage, so a worst-case 5 V rail may sit at 5.25 V, which raises the minimum input for a high level to 0.7 × 5.25 V = 3.675 V. If the 3.3 V rail has a negative 5 % tolerance, it drops to 3.135 V. So with tolerances taken into account the input may well be half a volt too low, about 15 % short of the required level.
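The worst-case arithmetic can be written out explicitly (the 5 % tolerance figures follow the assumption above; check your regulators' datasheets for the real numbers):

```python
# Worst-case HIGH-level margin with +/-5 % supply tolerances (illustrative).

vdd_5v_max = 5.0 * 1.05      # 5 V rail at +5 %  -> 5.25 V
vih_min = 0.7 * vdd_5v_max   # worst-case VIH(min) -> 3.675 V
v33_min = 3.3 * 0.95         # 3.3 V rail at -5 % -> 3.135 V

shortfall = vih_min - v33_min
print(f"VIH(min) = {vih_min:.3f} V, bus HIGH = {v33_min:.3f} V")
print(f"Shortfall: {shortfall:.3f} V ({shortfall / vih_min:.0%})")
```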

So,

> So I don't see a reason for a level-shifter as long as all devices detect the voltage from the pull-ups (3.3 V) as logical high. That should be the case with devices using 5 V as supply.

is a premature conclusion. Always check the datasheets and do the calculation.