Clarification of the concept "less resistance means less heating" in a wire

They’re describing the situation where the wires are carrying power to a load. It’s the load that (mostly) determines the current in the wires leading to it.

A $1200$ W oven on $120$ V needs $10$ A ($I = P/V = 1200/120$).

Once the load has determined the current, the heat in the wires is given by their resistance via $I^2 R_{wire}$.

A $0.02$ ohm wire to the oven will dissipate $2$ W of heat; a $0.01$ ohm wire will dissipate less: $1$ W.

That difference in wire resistance doesn’t change the current much, because the current is really controlled by the ~$12$ ohm heater resistance ($R = V/I = 120/10$). But it changes the wire heat a lot.
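
As a quick numerical check (a minimal Python sketch, modelling the heater and the wire as ideal resistors in series, with the heater at $R = V^2/P = 12$ ohms):

```python
# Sketch: oven example, heater and supply wire as resistors in series.
V = 120.0                  # supply voltage (V)
P_load = 1200.0            # oven power (W)
R_load = V**2 / P_load     # heater resistance: 12 ohms

for R_wire in (0.02, 0.01):
    I = V / (R_load + R_wire)   # the series circuit sets the current
    P_wire = I**2 * R_wire      # heat dissipated in the wire
    print(f"R_wire = {R_wire} ohm: I = {I:.3f} A, wire heat = {P_wire:.2f} W")

# R_wire = 0.02 ohm: I = 9.983 A, wire heat = 1.99 W
# R_wire = 0.01 ohm: I = 9.992 A, wire heat = 1.00 W
```

Halving the wire resistance changes the current by less than $0.1$%, but halves the wire heat.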


The rate of heating of a wire is the power dissipated in the wire, which is

$$P=I^{2}R$$

The resistance of wiring is much less than the resistance of the loads it supplies. Therefore the size of the wire has little effect on the current supplied to the loads; in other words, within a reasonable range of wire sizes, we can treat the current as constant.

For a fixed load current ($I$ = constant), the greater the wire (conductor) resistance, the greater the heating of the wire; the less the resistance, the less the heating. So a larger conductor carrying a fixed load current produces less heat because its resistance is lower.
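
For a rough illustration of the size effect (a sketch using $R = \rho L / A$ with the standard resistivity of copper; the length, current, and the two diameters are arbitrary example values):

```python
import math

# Sketch: conductor size -> wire resistance -> wire heating at fixed current.
rho_copper = 1.68e-8       # resistivity of copper (ohm·m)
L = 20.0                   # wire length (m), arbitrary example
I = 10.0                   # fixed load current (A)

for d_mm in (1.6, 2.6):                    # two example diameters (mm)
    A = math.pi * (d_mm / 2000.0)**2       # cross-sectional area (m^2)
    R = rho_copper * L / A                 # R = rho * L / A
    print(f"d = {d_mm} mm: R = {R:.4f} ohm, wire heat = {I**2 * R:.2f} W")

# d = 1.6 mm: R = 0.1671 ohm, wire heat = 16.71 W
# d = 2.6 mm: R = 0.0633 ohm, wire heat = 6.33 W
```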

Finally, overheating of undersized conductors is a concern because it can cause the insulation on the conductors to fail, either by melting or by long-term thermal degradation when temperatures exceed the insulation's temperature rating. Since insulation is relied upon to reduce the risk of fire and electric shock, overheating it increases both risks.

Hope this helps.


A secondary effect that the other answers don't talk about: thicker wires have more surface area through which to dissipate the heat generated by the electric current running through the wire.

If you double the diameter of a wire (keeping its length the same), its surface area also doubles, so twice as much heat can dissipate at the same wire temperature. If the wire is in thermal equilibrium with its surroundings, the heat dissipated must equal the power generated inside the wire, so doubling the diameter also doubles the power the wire can handle while staying at a given temperature. Put another way: at the same power, the double-diameter wire's temperature rise above the environment is only half that of the single-diameter wire.

E.g. if the environment is at $300$ K and the single-diameter wire sits at $330$ K, then the double-diameter wire will sit at only $315$ K for the same power generation.
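
To see those numbers come out (a sketch assuming Newtonian cooling, i.e. heat shed $= hA\,\Delta T$; the power value is arbitrary, and $hA$ is chosen to reproduce the $30$ K rise in the example):

```python
# Sketch: same power in both wires; delta_T = P / (h * A).
P = 60.0                   # power dissipated in each wire (W), arbitrary
hA_single = P / 30.0       # h*A picked so the single-diameter rise is 30 K
hA_double = 2 * hA_single  # double diameter -> double surface area

T_env = 300.0
print(T_env + P / hA_single)   # 330.0 K (single diameter)
print(T_env + P / hA_double)   # 315.0 K (double diameter), as above
```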

This effect is inversely linear in the diameter of the wire, rather than quadratic like the power generated inside the wire, so it is the smaller effect of the two. But if power generation at fixed current is proportional to $1/r^2$ (see the other answers for why this is the case) and the temperature difference with the environment at constant power generation goes as $1/r$, then the total temperature difference with the environment should scale as $1/r^3$. (Here, $r$ is the radius of the wire.)
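
A sketch of that combined scaling (fixed current, copper resistivity; the heat-transfer coefficient $h$ is an arbitrary illustrative value, again assuming heat shed per metre $= h \times$ surface area $\times \Delta T$):

```python
import math

# Sketch: temperature rise vs wire radius at fixed current.
rho = 1.68e-8    # copper resistivity (ohm·m)
I = 10.0         # fixed current (A)
h = 10.0         # heat-transfer coefficient (W/m^2/K), assumed value

def delta_T_per_metre(r):
    P = I**2 * rho / (math.pi * r**2)   # power per metre ~ 1/r^2
    A = 2 * math.pi * r                 # surface area per metre ~ r
    return P / (h * A)                  # so delta_T ~ 1/r^3

for r in (0.5e-3, 1.0e-3, 2.0e-3):
    print(f"r = {r*1000:.1f} mm: delta_T = {delta_T_per_metre(r):.2f} K")

# r = 0.5 mm: delta_T = 68.09 K
# r = 1.0 mm: delta_T = 8.51 K   <- doubling r divides delta_T by 8
# r = 2.0 mm: delta_T = 1.06 K
```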