What's up with the operating voltages 5 V, 3.3 V, 2.5 V, 1.8 V, etc.?

A lower VDD is required as the gate geometry shrinks. This prevents damage to the CMOS gate oxide and minimizes leakage. When the fabs switched from 0.5um to 0.35um, the thinner gate oxides could only handle potentials up to about 3.6V, which led to supplies of 3.3V +/- 10%. With the switch to 0.18um the voltage was reduced further to 1.8V +/- 10%. In the latest processes (e.g. 45nm), the gate dielectric is made from high-k materials such as hafnium oxide to reduce leakage.
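To see why the thinner oxide forces the voltage down, it helps to look at the electric field across it, E = V / t_ox, which has to stay below the dielectric's reliability limit (a few MV/cm for SiO2). Here's a rough back-of-the-envelope sketch; the oxide thicknesses are illustrative ballpark figures for those nodes, not exact process data.

```
# Why a thinner gate oxide forces a lower VDD: the electric field across the
# oxide, E = V / t_ox, must stay below the dielectric's breakdown/reliability
# limit (a few MV/cm for SiO2). The oxide thicknesses below are rough
# illustrative figures for those nodes, not exact process data.

def oxide_field_mv_per_cm(vdd_volts, t_ox_nm):
    """Electric field across the gate oxide, in MV/cm."""
    return vdd_volts / (t_ox_nm * 1e-7) / 1e6  # nm -> cm, then V/cm -> MV/cm

print(oxide_field_mv_per_cm(5.0, 10.0))  # ~0.5 um node at 5 V    -> ~5.0 MV/cm
print(oxide_field_mv_per_cm(3.3, 7.0))   # ~0.35 um node at 3.3 V -> ~4.7 MV/cm
print(oxide_field_mv_per_cm(1.8, 4.0))   # ~0.18 um node at 1.8 V -> ~4.5 MV/cm
print(oxide_field_mv_per_cm(5.0, 4.0))   # 5 V on a 4 nm oxide    -> ~12.5 MV/cm (too much)
```

Scaling VDD along with the oxide thickness keeps the field roughly constant; keeping the old 5V supply on a much thinner oxide would push the field well past reliable operation.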


That's a combination of several factors:

  • Conventions - it's easier to design a system when the chips are supplied with the same voltage. Even more important, the supply voltage determines the voltage levels of CMOS digital outputs and the voltage thresholds of inputs. The standard for chip-to-chip communication used to be 5V and is nowadays 3.3V, although recently there has been an explosion of low-voltage-swing serial communication interfaces. You could say that here "the industry" decides the supply voltage.
  • CMOS manufacturing process limitations - as MOS transistors shrink, so do the thickness of the gate insulation material and the channel length. As a result, the supply voltage has to be lowered to avoid reliability issues or damage. To maintain a "convenient" supply voltage at the I/O interfaces (like 3.3V - see above), the I/O cells are made from different (bigger and slower) transistors than the core of the chip. Here the "fab" (whoever designed the manufacturing process there) decides the voltage.
  • Power consumption - at each process generation a chip can accommodate 2x more transistors running at 2x higher frequency (at least that was true until recently). If nothing else is done, that gives a 2x2 = 4x increase in power consumption per unit area. To reduce it, the supply voltage is (or was) scaled down in proportion to the transistor dimensions, leaving a 2x increase in power per unit area (see the rough arithmetic sketched after this list). Here the chip designer's voice is important.
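
As a rough illustration of the power-consumption point above, here is the same arithmetic written out with the classic dynamic-power relation P ≈ α·C·V²·f. The activity factor and the simplification of keeping C per transistor constant are assumptions made only to reproduce the 4x / 2x figures; all numbers are normalized.

```
# Back-of-the-envelope dynamic power per unit area across one process
# generation, using P ~ alpha * C * V^2 * f. To reproduce the 4x / 2x figures
# above, C per transistor and the activity factor alpha are held constant;
# all numbers are normalized, not real process data.

def power_density(n_per_area, c_per_transistor, vdd, freq, alpha=0.1):
    """Dynamic power per unit area: N * alpha * C * V^2 * f."""
    return n_per_area * alpha * c_per_transistor * vdd**2 * freq

base = power_density(1.0, 1.0, 1.0, 1.0)

# Next node, same VDD: 2x transistors at 2x frequency -> ~4x the power density.
no_v_scaling = power_density(2.0, 1.0, 1.0, 2.0)

# Next node with VDD scaled by ~1/sqrt(2), tracking the linear dimensions:
# V^2 halves, leaving ~2x the power density instead of 4x.
with_v_scaling = power_density(2.0, 1.0, 1.0 / 2**0.5, 2.0)

print(no_v_scaling / base)    # 4.0
print(with_v_scaling / base)  # ~2.0
```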

Recently the picture got more complicated - the supply voltage can't easily scale down any further because of the limited intrinsic gain of the transistors. That gain sets a tradeoff (at a given supply voltage) between the "on" resistance of the transistor channel, which limits switching speed, and the "off" resistance, which determines how much current leaks through it. That's why the core supply voltage settled at around 1V, causing the speed of new digital ICs to grow more slowly and their power consumption to grow faster than they used to. Things get worse once you consider manufacturing process variability - if you can't position the transistor switching threshold voltage accurately enough (and as transistors get smaller that becomes very difficult), the margin between the "on" and "off" resistances disappears. Variability is an engineering problem and so, at least in theory, fixable, but the limited gain of MOS transistors is something we have to live with until we get better devices.
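
A minimal sketch of the leakage side of that tradeoff, assuming the textbook subthreshold model in which off-state current grows 10x for every "subthreshold swing" S of threshold-voltage reduction; the value of S and the Vth steps are illustrative, not from the answer above.

```
# Illustrative look at the leakage side of that tradeoff: lowering Vth to keep
# switching speed at a reduced VDD raises off-state leakage exponentially.
# Textbook subthreshold model: Ioff grows 10x for every "subthreshold swing" S
# of Vth reduction. S and the Vth steps below are illustrative values.

S = 0.090  # subthreshold swing, V/decade (~60 mV/dec ideal, 80-100 mV/dec typical)

def leakage_growth(delta_vth):
    """Factor by which off-state leakage grows when Vth is lowered by delta_vth volts."""
    return 10 ** (delta_vth / S)

for dv in (0.05, 0.10, 0.20):
    print(f"Vth lowered by {dv * 1000:.0f} mV -> leakage x{leakage_growth(dv):.0f}")
# ~x4, ~x13, ~x167: the "off" resistance collapses far faster than the "on"
# resistance improves, which is why the core supply stalled near 1 V.
```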


New voltages have often been chosen to give some degree of compatibility with what came before them.

3V3 CMOS output levels were compatible with 5V TTL inputs, for example.
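
As a quick check of that claim, the worst-case output levels of a 3.3V CMOS/LVTTL driver can be compared against standard 5V TTL input thresholds; the figures below are typical datasheet-style values, used here only for illustration.

```
# Quick noise-margin check for the "3V3 CMOS output driving a 5V TTL input"
# case. Threshold values are typical datasheet-style numbers (LVTTL/LVCMOS
# outputs, standard TTL inputs), used here only for illustration.

ttl_vih_min = 2.0     # 5V TTL input: minimum voltage recognized as HIGH
ttl_vil_max = 0.8     # 5V TTL input: maximum voltage recognized as LOW

cmos33_voh_min = 2.4  # worst-case HIGH level from a 3.3V CMOS/LVTTL output
cmos33_vol_max = 0.4  # worst-case LOW level from the same output

print("high-level noise margin:", cmos33_voh_min - ttl_vih_min)  # ~0.4 V -> OK
print("low-level noise margin: ", ttl_vil_max - cmos33_vol_max)  # ~0.4 V -> OK
# Both margins are positive, so a 3.3V CMOS output can drive a 5V TTL input directly.
```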