Why does increasing the voltage result in a decrease in current rather than an increase in power?

What I can't seem to grasp is why increasing the transmission voltage results in the current ($I$) dropping rather than the power ($P$) increasing.

When we say that increasing the transmission voltage decreases the current, we're considering the case where we're (for example) designing a power transmission system and have a certain group of customers to serve: say, a certain number of residences with some number of light bulbs, microwave ovens, air conditioners, and so on.

The power required by these customers is assumed to be constant, and whatever voltage we choose for the transmission line, we'll convert it to the local mains voltage (typically 120 or 240 V) with a transformer before delivering it to the customers.

So we'll change the design of the transformers we use to connect the generator to the transmission line, and the transmission line to the end customers, to produce the voltage we want on the transmission line.
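To make the transformer step explicit, here is the standard ideal-transformer relation (a textbook idealization underlying the argument, not something stated in this answer): a lossless transformer passes the power through unchanged, so the primary and secondary quantities satisfy

$$V_p I_p = V_s I_s \quad\Longrightarrow\quad \frac{I_s}{I_p} = \frac{V_p}{V_s},$$

i.e. stepping the voltage up by some factor steps the current down by exactly that factor.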

That means the power required by the customers stays constant, and however we change the transmission line voltage, the current required varies inversely.
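To put illustrative numbers on it (the figures below are made up for this sketch, not taken from any real system): if the customers draw a constant $P = 10\ \text{MW}$, then from $I = P/V$,

$$I = \frac{10\ \text{MW}}{10\ \text{kV}} = 1000\ \text{A} \qquad \text{vs.} \qquad I = \frac{10\ \text{MW}}{100\ \text{kV}} = 100\ \text{A},$$

so a tenfold increase in the line voltage means the line carries a tenth of the current.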


Note that if other losses are neglected, the electric power $P$ supplied by the source (e.g. an electric generator) remains constant. The power $P = VI$ is the product of two variables, the voltage $V$ and the electric current $I$ through the conductor (assuming a power factor of $1$).

Therefore, if the voltage $V$ increases, the electric current $I$ through the conductor decreases, and vice versa, to keep the power $P = VI$ constant.
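Written out, with $P$ held fixed, the inverse relationship is explicit:

$$I = \frac{P}{V} \quad\Longrightarrow\quad I \propto \frac{1}{V} \qquad (P\ \text{constant}).$$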


I think it's a wording problem rather than a lack of physics knowledge about efficiency and power. When we say we're altering a single quantity, we must keep one of the others fixed and then determine the effect on the remaining quantity:

For example: $V$ is increased and $I$ is kept the same, so what happens to $P$? It increases. $V$ is kept constant and $I$ is increased, so $P$ again increases. See?

For this scenario, $P$ is what is kept constant, because $P$ comes in many forms, not just in terms of current and voltage. It's what is generated at the power station from chemical and kinetic energy, etc., converted into electrical energy. So $P$ is kept constant because that is what we are generating, regardless of the choice of $I$ and $V$ at which we transfer that power.

$I$ and $V$ are then chosen subject to that fixed quantity $P$, and for the highest efficiency, as the other answers say, a greater $V$ is better: when $V$ increases, $I$ must decrease for constant $P$, and a large $I$ creates a lot of heat in the line, as the short derivation below makes quantitative.
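A minimal sketch of that last point, using the standard resistive-loss formula (the line resistance $R$ is an assumed parameter, not something from the answer above): the line carries $I = P/V$, so the power dissipated as heat in the line is

$$P_{\text{loss}} = I^2 R = \left(\frac{P}{V}\right)^2 R \;\propto\; \frac{1}{V^2} \qquad (P, R\ \text{fixed}),$$

so doubling the transmission voltage cuts the resistive heating by a factor of four.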