Why do many laptops run on 19 volts?

There are laptops that use external power supplies rated at exactly 19 volts. That isn't a multiple of anything suitable, which puzzles me a lot.

This is not a design question as posed, but it has relevance to the design of battery charging systems.

Summary:

  • The voltage is slightly more than a multiple of the fully charged voltage of a Lithium Ion cell.

  • Most laptops use Lithium Ion batteries.

  • 19 V is suitable for charging up to 4 Lithium Ion cells in series, using a buck converter to drop the excess voltage efficiently.

  • Various combinations of series and parallel cells can be accommodated.

  • Voltages slightly below 19 V can be used, but 19 V is a useful standard voltage that will meet most eventualities.


Almost all modern laptops use Lithium Ion (LiIon) batteries. Each battery pack consists of a number of LiIon cells connected in a series 'string', and may contain several such series strings connected in parallel.

A Lithium Ion cell has a maximum charging voltage of 4.2 V (4.3 V for the brave and foolhardy). To charge a cell to 4.2 V, at least slightly more voltage is required to provide some “headroom” for the charge control electronics to function. At the very least about 0.1 V extra might do, but usually at least 0.5 V is useful and more may be used.

One cell = 4.2 V
Two cells = 8.4 V
Three cells = 12.6 V
Four cells = 16.8 V
Five cells = 21 V.
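
As a quick sketch of the arithmetic above, assuming the 4.2 V per cell and roughly 0.5 V of headroom figures already mentioned (real designs pick their own margin):

```python
# Fully charged voltage of a series string of Li-ion cells, and the
# minimum charger input if we allow ~0.5 V of headroom.
# 4.2 V/cell and 0.5 V headroom are the figures used above.
CELL_FULL_V = 4.2
HEADROOM_V = 0.5

for n in range(1, 6):
    string_v = n * CELL_FULL_V
    print(f"{n} cell(s): {string_v:.1f} V fully charged, "
          f"needs more than {string_v + HEADROOM_V:.1f} V to charge")
```

With 19 V available, anything up to 4 cells in series fits comfortably; a 5-cell string at 21 V does not.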

It is usual for a charger to use a switched-mode power supply (SMPS) to convert the available voltage to the required voltage. An SMPS can be a boost converter (steps voltage up), a buck converter (steps voltage down), or swap from one to the other as required. In many cases a buck converter can be made more efficient than a boost converter. In this case, using a buck converter, it would be possible to charge up to 4 cells in series from a 19 V supply.
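
To give a feel for the buck converter's job, here is a minimal sketch using the idealised relation Vout ≈ duty cycle × Vin (continuous conduction, losses ignored; a real charge controller regulates current and voltage rather than holding a fixed duty cycle):

```python
# Idealised buck converter: Vout ~ D * Vin (continuous conduction,
# losses ignored).  Duty cycle needed from a 19 V input to reach the
# charging voltages listed above.
V_IN = 19.0

for v_out in (4.2, 8.4, 12.6, 16.8):
    duty = v_out / V_IN
    print(f"Vout = {v_out:4.1f} V -> duty cycle ~ {duty:.0%}")
```

A 5-cell string at 21 V would need a duty cycle above 100 %, which a buck converter cannot deliver, which is why four cells in series is the practical limit from 19 V.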

I have seen laptop batteries with

3 cells in series (3S),
4 cells in series (4S),
6 cells in 2 parallel strings of 3 (2P3S),
8 cells in 2 parallel strings of 4 (2P4S)

and with a source voltage of 19 V it would be possible to charge 1, 2, 3 or 4 LiIon cells in series and any number of parallel strings of these.
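
Purely for illustration, a small sketch of the arrangements listed above (parallel strings add capacity; only the series count sets the voltage):

```python
# Pack arrangements mentioned above: (parallel strings, cells in series).
# Parallel strings multiply capacity; only the series count sets voltage.
CELL_FULL_V = 4.2

packs = {"3S": (1, 3), "4S": (1, 4), "2P3S": (2, 3), "2P4S": (2, 4)}

for name, (parallel, series) in packs.items():
    print(f"{name}: {parallel * series} cells, "
          f"{series * CELL_FULL_V:.1f} V fully charged, "
          f"{parallel} parallel string(s)")
```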

For a 4-cell string at 16.8 V this leaves (19 − 16.8) = 2.2 V of headroom for the electronics. Most of this is not needed, and the difference is accommodated by the buck converter, which acts as an “electronic gearbox”, taking in energy at one voltage and outputting it at a lower voltage and a correspondingly higher current.
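
A rough numerical illustration of the “electronic gearbox” idea, with an assumed adapter current of 2 A and an assumed converter efficiency of 92 % (both figures are illustrative, not from any particular laptop):

```python
# "Electronic gearbox": power in ~ power out (minus converter losses),
# so stepping 19 V down to the pack voltage raises the available current.
# Illustrative numbers only - adapter current and efficiency vary by design.
V_IN = 19.0
I_IN = 2.0          # A drawn from the adapter (assumed)
EFFICIENCY = 0.92   # assumed converter efficiency

for v_pack in (12.0, 16.8):   # 4S pack near empty / near full
    i_out = EFFICIENCY * V_IN * I_IN / v_pack
    print(f"Pack at {v_pack:4.1f} V: about {i_out:.2f} A available for charging")
```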

With, say, 0.7 V of headroom it would notionally be possible to use 16.8 V + 0.7 V = 17.5 V from the power supply, but using 19 V ensures that there is enough for any eventuality, and the excess is not wasted because the buck converter converts the voltage down as required. Voltage drops other than in the battery occur in the SMPS switch (usually a MOSFET), SMPS diodes (or synchronous rectifier), wiring, connectors, resistive current-sense elements and protection circuitry. As little drop as possible is desirable to minimise energy wastage.
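
To give a feel for where those drops go, here is a purely illustrative budget; every figure below is an assumption for the sake of the example, not a measurement:

```python
# Illustrative voltage-drop budget between the 19 V input and a 4S pack.
# All individual values are rough assumptions.
drops_v = {
    "SMPS switch (MOSFET)": 0.10,
    "diode / synchronous rectifier": 0.15,
    "wiring and connectors": 0.20,
    "current-sense resistor": 0.05,
    "protection circuitry": 0.10,
}

total = sum(drops_v.values())
print(f"Incidental drops total ~ {total:.2f} V, "
      f"leaving {19.0 - 16.8 - total:.2f} V of the 2.2 V margin spare")
```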

When a Lithium Ion cell is close to fully discharged its terminal voltage is about 3 V. How far the cells are allowed to discharge is subject to technical considerations related to longevity and capacity. At 3 V/cell, 1/2/3/4 cells have a terminal voltage of 3/6/9/12 volts. The buck converter accommodates this reduced voltage to maintain charging efficiency. A good buck converter design can exceed 95 % efficiency and in this sort of application should never be under 90 % efficient (although some may be).
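
To put those efficiency figures in perspective, a small sketch assuming a nominal 40 W being delivered to the battery and system (the 40 W is an assumed figure):

```python
# Power lost in the converter at various efficiencies when delivering
# an assumed 40 W to the battery and system.
P_OUT = 40.0

for eff in (0.90, 0.92, 0.95):
    p_in = P_OUT / eff
    print(f"{eff:.0%} efficient: {p_in:.1f} W drawn, "
          f"{p_in - P_OUT:.1f} W lost as heat")
```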


I recently replaced a 4-cell netbook battery with an extended-capacity 6-cell version. The 4-cell version operated in a 4S configuration and the 6-cell version in 2P3S. Despite the lower voltage of the new battery, the charging circuitry accommodated the change, recognising the battery and adjusting accordingly. Making this sort of change in a system NOT designed to accommodate a lower-voltage battery could be injurious to the health of the battery, the equipment and the user.


The choice of 19 volts is because it is comfortably below 20 volts, which is the maximum output voltage of power supplies that can be certified as LPS (Limited Power Source) with non-inherent power delivery limits.

If you can keep at or below 20 volts, the whole safety certification thing becomes easier and cheaper.

To make sure you're within the limit, accounting for manufacturing tolerances, go 5 % lower, which gives 19 volts. There you are. It has nothing to do with battery pack organization or LCD screens.
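
For completeness, the margin arithmetic, using the 5 % rule of thumb stated above:

```python
# 20 V LPS ceiling minus a 5 % manufacturing-tolerance margin.
LPS_LIMIT_V = 20.0
MARGIN = 0.05
print(f"Nominal adapter output: {LPS_LIMIT_V * (1 - MARGIN):.0f} V")  # -> 19 V
```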


Russell's answer ( https://electronics.stackexchange.com/a/31621/88614 ) does a great job of looking at the details. This answer focuses more on the broader aspects of your question.

Typically mobile devices that have a mains-powered supply will accept a voltage that is a multiple of some single battery voltage.

I don't think this is generally true.

It is true that some devices have power inputs whose rated voltage is some multiple of the nominal cell voltage. They tend to be devices that can run off either mains or battery but that do not charge their own battery from the mains supply. Devices that do charge their own batteries are another matter.

In general you want the input voltage to your charging circuit to be above your battery voltage through the whole charge cycle.

A lithium ion/polymer cell is nominally 3.7 V or so, but the voltage needed to fully charge it is more like 4.2 V and the voltage when fully discharged may be more like 3 V. Laptop batteries generally have 3-4 cells in series. So 19 V gives a reasonable amount of headroom for the charging circuit.

Mobile phones, tablets and similar mobile devices with single-cell lithium ion batteries tend to use an input voltage of 5 V. I'm sure this is partly driven by the desire to run off USB, but also because it gives a reasonable amount of headroom for charging a single lithium ion/polymer cell.
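
A quick check of the headroom in both cases discussed in this answer, ignoring drops in wiring and the charger itself and assuming 4.2 V per fully charged cell:

```python
# Headroom above the fully charged battery voltage for the two common
# input voltages discussed above (wiring and charger drops ignored).
CELL_FULL_V = 4.2

cases = {
    "19 V adapter, 3S laptop pack": (19.0, 3),
    "19 V adapter, 4S laptop pack": (19.0, 4),
    "5 V USB, single-cell phone/tablet": (5.0, 1),
}

for name, (v_in, cells) in cases.items():
    print(f"{name}: {v_in - cells * CELL_FULL_V:.1f} V of headroom")
```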