PC ATX12VO (12V only) standard - Why does everybody say it has higher efficiency?

Modern PC parts still need those rails; SSDs, for example, usually use 5V for main operation.

No; NVMe SSDs attached via M.2 get a 1.5V and a 3.3V rail.

We use 5V in USB devices to charge up cellphones and power personal beverage coolers. So the need is not going away (any time soon) just because Intel has published a new standard.

USB is a very good example of something that will not change, indeed.

Why does moving the 5V and 3.3V rails to a different place ...
Why not throw a few more components into the ATX power supply "box" and down-convert from the high-demand 12V rail?

Because: you can't drive anything that draws significant current at 5V if that voltage comes from your ATX supply some 40cm of cable away – the voltage drop in the cable is too large. You can't drive any high-speed circuitry at a constant voltage through cabling, either – parasitic inductance means the supply simply can't react to quickly changing current draw.
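To put a rough number on the cable-drop argument, here is a minimal Python sketch; the wire gauge, cable length, and load current are illustrative assumptions, not values taken from the ATX specification:

    # Voltage drop on a 5V rail fed through ~40cm of cable.
    # Assumed: 18 AWG copper (~21 mOhm/m), a 10 A load, supply + return conductor.
    WIRE_RES_PER_M = 0.021   # ohm per metre, typical 18 AWG copper
    LENGTH_M = 0.4           # one-way cable length
    LOAD_A = 10.0            # assumed load current
    RAIL_V = 5.0

    r_cable = 2 * LENGTH_M * WIRE_RES_PER_M   # out and back
    v_drop = LOAD_A * r_cable
    print(f"cable resistance: {r_cable * 1000:.1f} mOhm")
    print(f"drop at {LOAD_A:.0f} A: {v_drop:.2f} V "
          f"({100 * v_drop / RAIL_V:.1f}% of the 5V rail)")
    # ~17 mOhm and ~0.17 V (~3.4%) before counting connector contact
    # resistance, which adds several mOhm more per pin.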

So you always need to convert close to your large loads. That's why there are arrays of rather beefy DC/DC converters around CPU sockets!

With the converters close to the load, you can build many converters, each optimized for the load profile of its individual load – and hence more efficient on average – rather than one necessarily massively overdimensioned converter that almost never runs close to its maximum-efficiency point.
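As a toy illustration of that averaging effect (the loss model and every number below are invented for this sketch, not taken from any real converter datasheet):

    # Toy loss model: losses = fixed overhead + a conduction (I^2*R-like) term,
    # both scaled to the converter's rated power. Purely illustrative.
    def efficiency(p_out, p_rated, fixed_frac=0.01, cond_frac=0.04):
        p_fixed = fixed_frac * p_rated              # overhead grows with size
        p_cond = cond_frac * p_out ** 2 / p_rated   # conduction loss term
        return p_out / (p_out + p_fixed + p_cond)

    load = 20.0  # watts actually drawn by, say, the 5V peripherals
    print(f"oversized 500 W converter at {load:.0f} W: {efficiency(load, 500):.1%}")
    print(f"right-sized 40 W converter at {load:.0f} W: {efficiency(load, 40):.1%}")
    # ~80% vs ~96%: the right-sized converter spends far less of its life
    # paying the big converter's fixed overhead.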

Linear regulators commonly found on cheap adapters, like M.2-to-SATA boards, generate the 3.3V rail by burning off the remaining 8.7V as a space heater. Aren't these more problematic, and shouldn't they be eliminated first?
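For scale, the arithmetic behind that claim, assuming a hypothetical 1 A load on the 3.3V rail:

    # Linear regulator dropping 12V to 3.3V: everything above the output
    # voltage is dissipated as heat. The 1 A load current is an assumption.
    V_IN, V_OUT, I_LOAD = 12.0, 3.3, 1.0

    p_load = V_OUT * I_LOAD               # power delivered: 3.3 W
    p_waste = (V_IN - V_OUT) * I_LOAD     # power burned off: 8.7 W
    print(f"delivered {p_load:.1f} W, dissipated {p_waste:.1f} W "
          f"-> {p_load / (V_IN * I_LOAD):.0%} efficient")
    # ~28% efficient, versus well over 90% for a typical buck converter.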

Yes, but these things are aftermarket, and I've not seen anyone use one in a modern system, so they are effectively never a problem. "Some car users throw an anchor out the window while driving; isn't that more problematic than shaving another liter per 100 km off a car's consumption?" comes to mind...

But you can easily design a multi-stage converter inside the PSU, converting from the mains just once.

You seem to be quite the expert on converter design, then... I'd see quite a few problems in building that to a high degree of efficiency.


Also, don't underestimate the idle current savings you get when you get rid of a bunch of rails that your target system probably doesn't even need.


  • 12V is the most common high-power supply rail in use.
    Producing a single output voltage allows the supply's efficiency to be optimised.

  • Routing lower voltages any distance requires substantially heavier wiring for the same power as the voltage decreases – or causes substantially higher losses if the same wiring is used. I²R losses mean that if, say, 100 watts were distributed at 12VDC, the wire would need 16x the cross-sectional area at 3V to maintain the same percentage loss (4x the current, squared; see the quick check after this list).

  • The continual decrease in the cost of power electronics, and the increasing switching frequency of switch-mode supplies (which lowers inductor cost), mean that the maximum cost benefit is achieved by using a single "high voltage" 12V supply and point-of-use converters.
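A quick numeric check of the 16x figure from the list above:

    # Same 100 W, same percentage loss, distributed at 12V vs 3V.
    P = 100.0
    for v in (12.0, 3.0):
        print(f"{v:>4} V -> {P / v:5.2f} A")
    # Current scales as 1/V, so I^2*R loss scales as 1/V^2 in the same wire.
    # 12V/3V = 4x the current -> 16x the loss; to keep the loss percentage
    # unchanged, R must drop 16x, i.e. 16x the copper cross-section.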


For a mains AC comparison -

Compare this to (in NZ, with 230 VAC single-phase residential distribution) nationwide AC reticulation at 220 kVAC / 300 kVAC, suburban distribution at typically 33 kV and 11 kV, 400 VAC three-phase for industry, and 230 VAC single-phase in the home.


Simple PC switching power supplies deliver several output voltages from a single transformer, which means only one output can be tightly regulated: preferably the output with the tightest regulation margin, i.e. 3V3. The other output voltages will be less accurate.

Additionally, the LC filters present on each output get in the way of fast transient response, so larger capacitors are needed to keep the voltage steady.
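A back-of-the-envelope sketch of that capacitance penalty; the step size, response time, and allowed droop below are assumed values:

    # While the control loop (slowed by the output LC filter) cannot respond,
    # the output capacitors alone must carry a load step: C = I * dt / dV.
    I_STEP = 10.0        # A, sudden load increase (assumed)
    T_RESPONSE = 100e-6  # s, effective loop response time (assumed)
    V_DROOP = 0.10       # V, allowed sag on the rail (assumed)

    c_needed = I_STEP * T_RESPONSE / V_DROOP
    print(f"needed capacitance: {c_needed * 1e6:.0f} uF")
    # -> 10000 uF; halve the response time and the requirement halves with it.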

The need for higher efficiency mandates synchronous rectification, which adds complexity as the synchronous rectification circuit must be duplicated for each output.

All this means many ATX power supplies are already 12V supplies with a bunch of buck converters on the output to generate the lower voltages, because that makes a better compromise wrt total cost and complexity.

Moving these buck converters to the motherboard makes a lot of sense because it eliminates voltage drop in the wires, which is a problem at low voltages. But a very important point is that it also eliminates wire impedance. With the wire impedance removed, a buck converter placed on the mainboard can have much better output regulation versus load variations and a faster transient response, while requiring less total capacitance on the output.
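To see why the wire's inductance matters as much as its resistance, a small sketch using V = L * di/dt (the inductance and step rate are rule-of-thumb assumptions):

    # A harness loop of roughly 0.5 uH (rule of thumb: ~1 uH per metre of
    # loop) reacts to a fast load step with a voltage excursion V = L*di/dt.
    L_WIRE = 0.5e-6      # H, assumed harness loop inductance
    DI_DT = 10.0 / 1e-6  # A/s: a 10 A step in 1 us (assumed)

    v_transient = L_WIRE * DI_DT
    print(f"transient across the harness: {v_transient:.1f} V")
    # -> 5 V across the wiring alone; a buck converter on the motherboard
    #    never sees this, so it can regulate far tighter with less capacitance.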

Also, a 12V-only wire harness will be cheaper, smaller, allow more airflow...

Considering all these factors, putting the low voltage converters where they should be, on the mainboard, should improve efficiency a bit while also lowering cost and making it easier to wire and easier to cool. Tons of advantages.