Why does working processors harder use more electrical power?

The rate at which these gates are turned on and off seems to have no relation to the power used.

This is where you are wrong. Each gate is essentially a capacitor with an incredibly tiny capacitance. Switching it on and off by "connecting" and "disconnecting" the voltage moves an incredibly tiny electrical charge into or out of the gate - that charge is what makes the gate behave differently.

And a moving electrical charge is a current, which uses power. All those tiny currents from billions of gates being switched billions of times per second add up quite a bit.
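To put rough numbers on that, the switching power is commonly approximated as P ≈ α·C·V²·f, i.e. activity factor times switched capacitance times voltage squared times clock frequency. The sketch below is only illustrative; every constant in it is an assumption, not a measured value:

```python
# Back-of-the-envelope model of dynamic (switching) power:
#   P_dyn ≈ alpha * C * V^2 * f
# alpha = fraction of gates toggling each cycle, C = total switched
# capacitance, V = supply voltage, f = clock frequency.
# All figures below are assumptions chosen for illustration.

alpha = 0.02          # ~2% of gates switch per cycle (assumed)
c_per_gate = 0.5e-15  # ~0.5 femtofarads per gate (assumed)
n_gates = 1e9         # a billion gates (assumed)
v = 1.0               # 1 V supply (assumed)
f = 3e9               # 3 GHz clock (assumed)

p_dyn = alpha * (c_per_gate * n_gates) * v**2 * f
print(f"Estimated switching power: {p_dyn:.0f} W")  # ~30 W
```

Double the clock frequency, or the fraction of gates that actually toggle, and this term doubles with it.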


As SK-logic's comment points out, most power is really spent on switching flip-flops rather than in a steady state.

For dynamically reducing power consumption, there are two main things you can do, IIRC:

  1. If whole areas of a chip are not being clocked, you can potentially turn off the power for those areas completely.

  2. The clock tree itself is one of the largest power drains in the system, largely because it is the fastest-switching part of the system, so reducing the power spent in the clock tree itself is significant (see the sketch after this list).
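Here is a toy model of those two ideas, with made-up per-block wattages: clock-gating an idle block removes its switching power but not its leakage, while power-gating removes both:

```python
# Toy model of clock gating vs. power gating (all numbers are assumptions).
# A clock-gated block stops paying switching power but still leaks;
# a power-gated block pays (almost) nothing.

def block_power(switching_w, leakage_w, clock_gated=False, power_gated=False):
    """Crude per-block power estimate in watts."""
    if power_gated:
        return 0.0                   # supply cut: no switching, no leakage
    if clock_gated:
        return leakage_w             # clock stopped: only leakage remains
    return switching_w + leakage_w   # running normally

busy             = block_power(switching_w=20.0, leakage_w=5.0)
idle_clock_gated = block_power(switching_w=20.0, leakage_w=5.0, clock_gated=True)
idle_power_gated = block_power(switching_w=20.0, leakage_w=5.0, power_gated=True)

print(busy, idle_clock_gated, idle_power_gated)  # 25.0 5.0 0.0
```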


The power consumed by an electronic circuit has two components:

  • the leakage, which is more or less constant, independent of the frequency, and depends on the technology and working voltage;
  • the switching power, which depends on the frequency (it's due to charging and discharging the various capacitances of transistors and wires); see the sketch below.
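A minimal sketch of that split, using assumed constants: only the switching term grows with frequency, while the leakage stays flat:

```python
# Total power = leakage (roughly constant) + switching (scales with f).
# The constants are assumptions chosen only to illustrate the split.

P_LEAK_W = 10.0   # leakage: fixed by technology and voltage (assumed)
K_SWITCH = 8e-9   # effective alpha*C*V^2, in watts per hertz (assumed)

def total_power(freq_hz):
    return P_LEAK_W + K_SWITCH * freq_hz

for f_ghz in (1, 2, 3, 4):
    print(f"{f_ghz} GHz -> {total_power(f_ghz * 1e9):.0f} W")
# 1 GHz -> 18 W, 2 GHz -> 26 W, 3 GHz -> 34 W, 4 GHz -> 42 W
```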

In order to reduce consumption, processor designers use several techniques:

  • modifying the frequency depending on the load (this will act only on the switching power)
  • reducing power to, or even completely powering off, parts of the circuit when they aren't needed

As a result of these techniques, depending on your load, you may be better off, from a power-consumption point of view, either reducing the frequency or doing a "sprint" at full speed and then powering off a subset of the circuits.
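As a rough illustration of that trade-off, the sketch below reuses the assumed constants from above and compares finishing a fixed job slowly against sprinting at full speed and then power-gating; the 3-second comparison window and the assumption that the gated-off parts draw nothing are also made up for the example:

```python
# Compare two strategies for a fixed amount of work, using the assumed
# leakage/switching split from above. Which one wins depends on how much
# leakage you pay while the circuit stays powered.

WORK_CYCLES = 3e9   # fixed job size: 3 billion cycles (assumed)
P_LEAK_W = 10.0     # leakage while powered, any frequency (assumed)
K_SWITCH = 8e-9     # switching power per hertz (assumed, as above)

def energy_joules(freq_hz, idle_power_w=0.0, window_s=3.0):
    """Run the job at freq_hz, then power-gate (idle_power_w) for the
    remainder of the comparison window."""
    run_s = WORK_CYCLES / freq_hz
    p_run = P_LEAK_W + K_SWITCH * freq_hz
    return p_run * run_s + idle_power_w * (window_s - run_s)

print(energy_joules(1e9))  # slow:   3 s at 18 W = 54 J
print(energy_joules(3e9))  # sprint: 1 s at 34 W = 34 J
```

With these particular numbers the sprint wins because leakage is paid for the whole time the circuit stays powered; lower the leakage, or let the voltage drop along with the frequency, and the balance can tip the other way.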