Why do many electronics operate on 5 AND 3.3 Volts?

5 V became widely used in early logic families, especially TTL. While TTL is very much passé now, everybody still talks about "TTL levels". (I even hear UART described as a "TTL bus", which is a misnomer: it's a logic-level communication channel, but may well use a different voltage than 5 V.) For TTL, 5 V was a good choice for the operating points of the BJTs and for high noise immunity.

The 5 V level was retained when technology switched to HCMOS (High-Speed CMOS), with 74HC as the best-known family; 74HC ICs can operate at 5 V, and the 74HCT variant is TTL-compatible in its input levels as well. That compatibility may be required in mixed-technology circuits, and that's why 5 V won't be completely abandoned soon.

But HCMOS doesn't need 5 V the way TTL's bipolar transistors did. A lower voltage means lower power consumption: an HCMOS IC at 3.3 V will typically consume 50 % or less of the power the same circuit draws at 5 V. So you create a microcontroller which internally runs at 3.3 V to save power, but has 5 V I/O. (The I/O may also be merely 5 V-tolerant: it works at 3.3 V levels, but won't be damaged by 5 V on its inputs.) Besides compatibility, 5 V also offers better noise immunity.

And it goes further. I've worked with ARM7TDMI controllers (NXP LPC2100) with a core running on 1.8 V and 3.3 V I/O. The lower voltage gives extra power savings (only about 13 % of a 5 V controller's consumption) and lower EMI as well. The drawback is that you need two voltage regulators.
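Both of the percentages above follow from the fact that CMOS dynamic power scales with the square of the supply voltage (P = C·V²·f, with switched capacitance and clock frequency held equal). A quick sketch, just to check the arithmetic:

```python
# CMOS dynamic power scales as P = C * V^2 * f; with C and f fixed,
# the ratio between two supply voltages is simply (V1 / V2)^2.
def power_ratio(v, v_ref=5.0):
    """Dynamic power at supply v, relative to the same circuit at v_ref."""
    return (v / v_ref) ** 2

print(f"3.3 V vs 5 V: {power_ratio(3.3):.0%}")  # 44% -> the "50% or less" above
print(f"1.8 V vs 5 V: {power_ratio(1.8):.0%}")  # 13% -> the LPC2100 core figure
```

This is only the dynamic (switching) component; static leakage doesn't follow the same square law, but at these geometries it was a minor contributor.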

So that's the trend: internally ever lower voltages for lower power consumption and EMI, and externally a higher voltage for better noise immunity and connectivity.


Sure. But remember that power consumption increases with the square of the supply voltage. Raising the voltage from 3.3 V to 5 V increases power consumption by a factor of about 2.3. Therefore there is value in using as low a voltage as possible, even if the power supply loses something in the conversion.
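The 2.3× figure is just the square of the voltage ratio:

```python
# Power goes up with the square of the supply voltage, so 3.3 V -> 5 V gives:
increase = (5.0 / 3.3) ** 2
print(f"{increase:.1f}x")  # prints "2.3x"
```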


Most electronics had +5 V available on board, so when circuits only required 3.3 V it was easier to drop the voltage on the chip than to require manufacturers to re-engineer their power supplies and boards to add a 3.3 V rail. The lower voltage not only reduces power consumption; in high-speed digital circuits it also takes less time to swing from one rail to the other.