Low amps high volts vs high amps low volts?

Going off of your post content rather than the title, I think you are asking why you would use higher voltage and lower current rather than lower voltage and higher current to deliver the same amount of power.

It is often more efficient to use higher voltage and lower current. Wires have an ampacity rating, which is the maximum current they can safely carry. If you double the voltage and keep the current the same, the same wire can deliver twice the power to the load.
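As a rough sketch (the 15 A ampacity and the voltages below are made-up, illustrative numbers), the power a given wire can deliver at its current limit scales directly with voltage, since P = V × I:

```python
# Illustrative sketch: for a wire limited by its ampacity, the deliverable
# power scales directly with voltage (P = V * I). Numbers are hypothetical.

def max_power_watts(voltage_v: float, ampacity_a: float) -> float:
    """Maximum power a wire rated for ampacity_a amps can carry at voltage_v volts."""
    return voltage_v * ampacity_a

AMPACITY = 15.0  # amps -- assumed rating for this example

print(max_power_watts(120.0, AMPACITY))  # 1800.0 W at 120 V
print(max_power_watts(240.0, AMPACITY))  # 3600.0 W at 240 V -- same wire, twice the load
```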

However, there are practical limits to how high a voltage can be used. At higher voltages, certain safety considerations come into play. Wires and other components have voltage ratings, and going above those ratings may cause the electric field to exceed the dielectric strength of the insulation, leading to insulation breakdown and an electrical short.

There are more complex reasons why certain devices, such as microcontrollers, all use 3.3 V or 5 V. This Quora post has some good explanations. It seems that transistors in use at the time had a forward voltage drop of about 0.7 V, and many useful combinations of transistors required at least 3 V. Also, if these devices used a higher voltage, more space would be required to insulate internal components, or material with a higher dielectric strength would have to be used, which may not be cost-effective.


In terms of power delivered, the two are equivalent (P = V × I): for example, 120 V × 10 A and 12 V × 100 A both deliver 1200 W. However, there are always larger system-level conditions and constraints.

Higher voltage is often used to send large amounts of power over long distances specifically because, for the same power, the current is lower, and the resistive loss in the transmission lines scales with the square of the current (for example, in the public power utility grid).
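As an illustrative sketch (the 1 MW load, 5 Ω line resistance, and voltages are assumed numbers, not real grid figures), raising the transmission voltage tenfold cuts the I²R line loss a hundredfold:

```python
# Illustrative sketch: resistive loss in a line is I^2 * R, so for a fixed
# delivered power, a higher voltage (lower current) means much lower loss.
# All numbers are assumptions for the sake of the example.

def line_loss_watts(power_w: float, voltage_v: float, line_resistance_ohm: float) -> float:
    current_a = power_w / voltage_v              # I = P / V
    return current_a ** 2 * line_resistance_ohm  # loss = I^2 * R

POWER = 1_000_000.0  # 1 MW delivered (assumed)
R_LINE = 5.0         # ohms of line resistance (assumed)

print(line_loss_watts(POWER, 10_000.0, R_LINE))   # 50000.0 W lost at 10 kV
print(line_loss_watts(POWER, 100_000.0, R_LINE))  # 500.0 W lost at 100 kV
```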

But sometimes it is more efficient to use lower voltage and higher current, a car battery being a good example. Because of the way batteries work, it is more efficient to use large cells capable of hundreds of amps, and that works because the distance to the load (the starter motor) is quite short.
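A rough sketch with assumed numbers (about 200 A of cranking current and roughly 2 mΩ for a short, heavy battery cable) shows why the short distance matters: the I²R loss stays a small fraction of the power delivered.

```python
# Illustrative sketch: over a very short, thick cable, I^2 * R loss stays small
# even at hundreds of amps, which is why a 12 V starter circuit works fine.
# The current and cable resistance below are assumptions, not measured values.

def cable_loss_watts(current_a: float, resistance_ohm: float) -> float:
    return current_a ** 2 * resistance_ohm

STARTER_CURRENT = 200.0   # amps while cranking (assumed)
CABLE_RESISTANCE = 0.002  # ohms for ~1 m of heavy battery cable (assumed)

loss_w = cable_loss_watts(STARTER_CURRENT, CABLE_RESISTANCE)
delivered_w = 12.0 * STARTER_CURRENT  # ~2400 W from the 12 V battery

print(f"loss ~= {loss_w:.0f} W out of ~= {delivered_w:.0f} W delivered")  # ~80 W vs ~2400 W
```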