Will using a resistor in series with an LED to control its voltage increase the total energy expenditure?

You've got the right idea. Partly.

An LED used with a series resistor does waste the energy dissipated in the resistor. Depending on the voltage from the power supply, you can easily waste more energy in the resistor than you use for the LED.

So far, you are correct.

What I want to correct is the idea that the resistor is there to lower the voltage.

The resistor is there to limit the current.

LEDs are current-driven devices. The forward voltage varies with current and temperature.

To get a stable brightness out of an LED, you regulate the current.

You will have noticed that the calculation for the series resistor uses the desired LED current. You take the difference of the supply voltage and the (approximate) forward voltage of the LED, and divide that by the current to find the value for the series resistor.
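As a concrete illustration of that calculation, here is a minimal sketch in Python; the 5 V supply, 2 V forward voltage and 20 mA target current are assumed example values, not figures from the question:

```python
# Minimal sketch of the series-resistor calculation described above.
# The supply voltage, forward voltage and target current are assumed examples.

def series_resistor(v_supply, v_forward, i_led):
    """Resistance that drops the excess voltage at the desired LED current."""
    return (v_supply - v_forward) / i_led

def resistor_power(v_supply, v_forward, i_led):
    """Power dissipated (wasted) in the series resistor."""
    return (v_supply - v_forward) * i_led

v_supply = 5.0    # volts, assumed supply
v_forward = 2.0   # volts, approximate LED forward voltage (assumed)
i_led = 0.020     # amps, desired LED current (20 mA)

r = series_resistor(v_supply, v_forward, i_led)    # 150 ohms
p_r = resistor_power(v_supply, v_forward, i_led)   # 60 mW wasted in the resistor
p_led = v_forward * i_led                          # 40 mW used by the LED

print(f"R = {r:.0f} ohm, resistor wastes {p_r*1000:.0f} mW, LED uses {p_led*1000:.0f} mW")
```

Note that in this example the resistor dissipates more power than the LED itself, which is exactly the waste mentioned at the start.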

If you tried to regulate an LED only by regulating the voltage, then you would destroy your LED very quickly. Just below the forward voltage, the LED doesn't light up at all. Just above the forward voltage the LED becomes the next best thing to a short circuit. There is a tiny range in between where it lights up and passes only a little bit of current.

That little range is impossible to hit with just a voltage regulator, because it moves with temperature and current: current makes the LED warmer, and a warmer LED conducts more. You would end up varying the voltage up and down wildly, with some kind of feedback circuit measuring the current anyway.
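To put a rough number on how steep that range is, here is an illustrative sketch using an idealized exponential (Shockley-style) I-V relation; the ideality factor and thermal voltage are assumed round figures, not measurements of any particular LED:

```python
import math

# Illustrative only: idealized exponential I-V relation, I proportional to
# exp(V / (n * Vt)), with an assumed ideality factor n and thermal voltage Vt.
n = 2.0      # ideality factor (assumed)
vt = 0.026   # thermal voltage at room temperature, about 26 mV

def current_ratio(delta_v):
    """How much the current multiplies for a small increase in forward voltage."""
    return math.exp(delta_v / (n * vt))

for dv in (0.05, 0.10, 0.20):
    print(f"+{dv*1000:.0f} mV  ->  current x{current_ratio(dv):.1f}")
# +50 mV   ->  current x2.6
# +100 mV  ->  current x6.8
# +200 mV  ->  current x46.8
```

A fifth of a volt too much and the current is dozens of times higher, which is why you regulate the current, not the voltage.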

Or, just regulate the current to begin with. Provide no more current than needed to light your LED at the desired brightness, and let the voltage do as it pleases - the voltage is of no interest.


Yes, that resistor wastes power.

If the author is using an LED for an indicator light, then they're wasting a lot more power by their choice of LED. An LED that needs 20 mA to show up in a brightly-lit room is typical of 1970s technology. If you shop for higher-brightness LEDs you'll blast your eyeballs out at 20 mA, and you'll find yourself stopping the thing down to 1 mA or so. One such LED at 1 mA, with a matching series resistor, draws about 3.3 mW from a 3.3 V supply, whereas a 20 mA, 1.5 V LED alone (never mind the resistor) uses 30 mW.
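A quick sanity check of those figures, assuming the 1 mA operating point and 3.3 V supply mentioned above:

```python
# Rough comparison of indicator-LED power, using the figures from this answer.
# The 3.3 V supply and 1 mA operating current are assumptions stated above.

# High-efficiency LED run at 1 mA from 3.3 V, series resistor included:
p_modern = 3.3 * 0.001        # 3.3 mW total (LED + resistor)

# Older-style 20 mA LED, counting only the LED itself at ~1.5 V forward voltage:
p_old_led_only = 1.5 * 0.020  # 30 mW, before the resistor's share

print(f"modern LED + resistor: {p_modern*1000:.1f} mW")
print(f"old 20 mA LED alone:   {p_old_led_only*1000:.1f} mW")
```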

The ultimate way to reduce the circuit power consumption would be to use the most efficient LEDs that you could find, and power them with a switching converter. A decent switching converter will have somewhere between 80% and 95% efficiency, so you'll use somewhere between 25% and about 5% more power than the LED alone. But you'd have to use one per LED (or LED string), and it's hard to justify a super-efficient switching converter for each indicator light.
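For reference, the overhead follows directly from the efficiency (a back-of-envelope sketch, not tied to any particular converter):

```python
# With converter efficiency eta, the input power is P_led / eta, so the extra
# power relative to the LED alone is (1 / eta) - 1.
for eta in (0.80, 0.90, 0.95):
    overhead = 1.0 / eta - 1.0
    print(f"efficiency {eta:.0%}: about {overhead:.0%} more power than the LED alone")
# efficiency 80%: about 25% more power than the LED alone
# efficiency 90%: about 11% more power than the LED alone
# efficiency 95%: about 5% more power than the LED alone
```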


No, you are not missing anything. The energy consumed by the resistor is wasted, but if you were contemplating a circuit that used tens or hundreds of LEDs, you might consider a buck regulator to step the LED circuit supply voltage down to maybe 3 volts and make a significant net power saving per LED drive.

You’ll still need a 50 ohm resistor, but it will drop only around 1 volt and dissipate only 20 mW.
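For completeness, the numbers behind that resistor, assuming roughly a 2 V forward voltage at 20 mA (the forward voltage and current are my assumptions here):

```python
# Sketch of the figures behind the buck-regulator suggestion. The 3 V supply
# follows the answer; the 2 V forward voltage and 20 mA current are assumed.
v_supply = 3.0
v_forward = 2.0   # assumed LED forward voltage
i_led = 0.020     # assumed 20 mA LED current

v_drop = v_supply - v_forward   # ~1 V across the resistor
r = v_drop / i_led              # 50 ohms
p_r = v_drop * i_led            # 20 mW dissipated in the resistor

print(f"R = {r:.0f} ohm, dropping {v_drop:.1f} V, dissipating {p_r*1000:.0f} mW")
```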

The good news is that many modern LEDs need only a couple of mA to obtain sufficient brightness for “standard” applications.