> Computer power supplies usually have higher efficiency on 230V than on 115V. Why?

Per \$P = U \cdot I\$, to achieve the same power at a lower voltage you have to draw a higher current.

In resistive components such as wires, PCB traces and transformer windings (green), losses grow with the square of the current: \$P_{loss} = R \cdot I^2\$.

In switching components, diodes and rectifiers (green), the losses are \$P_{loss} = V_{f} \cdot I\$, where \$V_{f}\$ is the forward voltage drop of the component (roughly 1 V for a silicon rectifier). It is a property of the device and does not depend on the input voltage, so this loss scales directly with the current.

Eddy-current losses (red) in any magnetic core also increase as the current (and thus the electromagnetic field) increases.
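As a minimal sketch of the two conduction-loss terms above, the following Python snippet compares 115 V and 230 V operation; the 500 W mains draw, 50 mΩ series resistance and 1 V rectifier drop are assumed example values, not figures from a real supply.

```python
# Rough illustration of how conduction losses scale with input voltage.
# All numbers (500 W draw, 50 mOhm series resistance, 1 V rectifier drop)
# are assumed for the example, not taken from a real PSU.

P_IN = 500.0        # power drawn from the mains, W
R_SERIES = 0.05     # total resistance of wires/traces/windings, ohm
V_DROP = 1.0        # forward drop of the input rectifier, V

for u in (115.0, 230.0):
    i = P_IN / u                     # P = U * I  ->  I = P / U
    p_resistive = R_SERIES * i**2    # P_loss = R * I^2
    p_rectifier = V_DROP * i         # P_loss = V_f * I
    print(f"{u:5.0f} V: I = {i:4.2f} A, "
          f"resistive loss = {p_resistive:4.2f} W, "
          f"rectifier loss = {p_rectifier:4.2f} W")

# Halving the current (230 V vs. 115 V) quarters the resistive loss
# and halves the rectifier loss.
```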

*(image: the power supply with the component groups highlighted in green and red, as referenced above)*

Losses related to capacitor leakage are negligible.


As Oskar Skog proposed, the power factor corrector (PFC) is the main suspect.

The PFC is usually a precisely controlled boost converter that converts the pulsating rectified mains into a steady bus of roughly 350-400 V. A boost converter's efficiency depends on how far it has to step the voltage up: the higher the input, the less it has to convert. On 230 V mains the rectified peak is about 325 V, already close to the PFC output, whereas on 115 V the peak is only about 163 V.
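A back-of-the-envelope sketch of the step-up the PFC boost stage must perform at each mains voltage; the 390 V bus voltage is an assumed typical value, not a figure from the original answer.

```python
import math

# Compare the step-up the PFC boost stage must perform.
# The 390 V output is an assumed typical PFC bus voltage.

V_PFC_OUT = 390.0

for v_rms in (115.0, 230.0):
    v_peak = v_rms * math.sqrt(2)      # peak of the rectified mains
    boost_ratio = V_PFC_OUT / v_peak   # how much the voltage must be raised
    duty = 1 - v_peak / V_PFC_OUT      # ideal boost duty cycle at the peak
    print(f"{v_rms:3.0f} V mains: peak ~ {v_peak:5.1f} V, "
          f"boost ratio ~ {boost_ratio:3.2f}, duty ~ {duty:4.2f}")

# 115 V: peak ~ 163 V -> ratio ~ 2.4;  230 V: peak ~ 325 V -> ratio ~ 1.2.
# The smaller the ratio, the less energy cycles through the boost inductor
# and switch, so switching and conduction losses are lower.
```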

> if someone is to build a power supply that only works on 115V, is it more difficult to achieve the same efficiency as one built only for 230V?

Generally, making a PSU that accepts a wider range of input voltages is harder and forces more compromises in other parameters (efficiency, weight, price).

To a lesser extent, with modern components and in the power range typical of computer PSUs, a 230 V-only design is marginally easier and slightly more efficient than a 115 V-only one.