Doesn't using resistors in series with LEDs all the time waste a lot of energy?

Yes, it wastes power, but most of the time it's not enough power to matter.

In cases where efficiency matters, you use other, more complicated means. For example, take a look at the schematic for my KnurdLight example project. This is battery operated, and just about all the power is going into the LEDs. In this case I used a boost converter that directly regulates LED current instead of a normal power supply that regulates voltage. There is no series resistor to make a fixed-voltage supply look at least partially like a current source, because the power supply is a current source in the first place. R6 is in series with the LED string, but it is only 30 Ω and is there to sense the current so that the boost converter can regulate it.
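As a rough sketch of how that current regulation works: the converter adjusts its switching until the voltage across the sense resistor matches its internal feedback reference, so the LED current is simply that reference divided by the sense resistance. The 0.6 V reference below is a made-up placeholder, not a value from any particular part:

    # Sketch of current-mode feedback: LED current is set by the sense
    # resistor and the converter's feedback reference (hypothetical value here).
    V_REF = 0.6      # V, assumed feedback reference -- check the real converter's datasheet
    R_SENSE = 30.0   # ohms, the series sense resistor (like R6 in the example schematic)

    i_led = V_REF / R_SENSE          # regulated LED current, A
    p_sense = i_led**2 * R_SENSE     # power burned in the sense resistor, W

    print(f"LED current: {i_led*1000:.1f} mA, sense-resistor loss: {p_sense*1000:.1f} mW")

With these assumed numbers the sense resistor only burns about 12 mW, which is the point: the "resistor" in the loop is there for feedback, not to drop the excess supply voltage.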


Why resistors?

The reason we use resistors to set LED current is that an LED is a diode, and like most diodes, it looks essentially like a fixed voltage drop when forward biased. There is very little to limit the current if it is hooked directly to a voltage source; the I/V curve is so steep that a 0.1 V change in forward voltage can mean a 10× change in current. A direct connection to a supply, without some workable current-limiting mechanism, will therefore likely destroy the LED. So we put a resistor in series to make the slope shallow enough to control the current.
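To see how steep that slope is, you can model the LED with the Shockley diode equation. The saturation current and ideality factor below are just plausible guesses, not data for any real LED; with these numbers a 0.1 V step gives roughly a 7× change in current, and with a lower ideality factor it easily exceeds 10×:

    import math

    # Shockley diode model: I = Is * (exp(V / (n * Vt)) - 1)
    # Is and n are illustrative guesses; real LEDs vary widely.
    Is = 1e-18   # saturation current, A (assumed)
    n = 2.0      # ideality factor (assumed)
    Vt = 0.026   # thermal voltage at room temperature, V

    def led_current(v):
        return Is * (math.exp(v / (n * Vt)) - 1)

    # A 0.1 V increase in forward voltage multiplies the current by about exp(0.1 / (n*Vt)):
    ratio = led_current(2.1) / led_current(2.0)
    print(f"Current ratio for +0.1 V: {ratio:.1f}x")   # roughly 7x with these numbers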

Typically, you figure out how much current you want in the LED based on some brightness measurement from the data sheet, or buy one and guess. For typical indicator LEDs, I start with 2 mA for normal or 0.5 mA for high-efficiency LEDs, and usually have to reduce the current further.

Once you pick a current, you take that, the voltage of your source (VS), and the forward voltage of your LED at your current (VF, try to get this from the graph in the data sheet rather than the table, which typically is characterized at 10 mA or more), and plug them into the following equation to get your resistance:

R = (VS - VF) / I

Derivation: Given that the voltage drop across the resistor is VR = I * R (Ohm's Law), that the current in the loop is the same everywhere (Kirchhoff's Current Law), and that the source voltage is equal to VF + VR (Kirchhoff's Voltage Law):

VS = VF + VR = VF + I * R; VS - VF = I * R; R = (VS - VF) / I
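As a worked example of that formula (the numbers are purely illustrative: a 5 V supply, a 2 V indicator LED, and a 2 mA target current):

    def led_resistor(v_supply, v_forward, i_led):
        """Series resistor for an LED: R = (VS - VF) / I."""
        return (v_supply - v_forward) / i_led

    # Illustrative values: 5 V supply, 2 V forward drop, 2 mA target current.
    r = led_resistor(5.0, 2.0, 0.002)
    p_resistor = (5.0 - 2.0) * 0.002   # power dissipated in the resistor

    print(f"R = {r:.0f} ohms")               # 1500 ohms; use the nearest standard value
    print(f"P = {p_resistor*1000:.0f} mW")   # 6 mW "wasted" -- usually not worth worrying about

The 6 mW burned in the resistor is why the waste usually doesn't matter for indicator LEDs.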

High Power LEDs

For applications where the power waste is a problem, such as in large-scale lighting applications, you don't use a resistor but instead use a current regulator to set the LED's current.

These current regulators work like switching voltage regulators, except that instead of dividing down the output voltage, comparing it to a reference, and adjusting the output, they use a current-sensing element (a current-sense transformer or a low-value resistor) to generate the voltage that is compared to the reference. This can be very efficient, depending on switching-element losses and switching frequency. (Higher frequencies react faster and allow smaller components, but are less efficient.)
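To put a rough number on that, here is a comparison of the power lost in a switching regulator's low-value sense resistor versus a plain ballast resistor, for a 350 mA power LED on a 12 V supply. The 0.2 V sense reference and the 3.4 V forward voltage are assumed values for illustration only:

    # Hypothetical numbers: 350 mA power LED, Vf ~ 3.4 V, 12 V supply.
    V_SUPPLY = 12.0
    V_F = 3.4
    I_LED = 0.35
    V_REF = 0.2     # assumed current-sense reference of the regulator

    # Switching regulator: only the sense resistor drops a fixed small voltage.
    r_sense = V_REF / I_LED                 # ~0.57 ohm
    p_sense = V_REF * I_LED                 # ~70 mW lost in sensing

    # Plain ballast resistor: it must drop everything above the LED's forward voltage.
    r_ballast = (V_SUPPLY - V_F) / I_LED    # ~24.6 ohm
    p_ballast = (V_SUPPLY - V_F) * I_LED    # ~3 W wasted as heat

    print(f"sense loss {p_sense:.2f} W vs ballast loss {p_ballast:.2f} W")

The converter's own switching losses eat into this advantage, which is where the switching-element and frequency trade-offs mentioned above come in.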


When an LED is driven with a resistor, it's necessary that the supply voltage be higher than the forward drop of the LED; the current drawn from the supply will be equal to the current through the LED. The percentage of supply power that goes to the LED will correspond to the ratio of the LED forward voltage to the supply voltage.

There are other ways of driving LEDs which will work with supply voltages below the forward drop of the LED, or which will draw less current from the supply than they put through the LED. Such techniques may, for example, roughly halve the current drawn from a 5-volt supply while feeding 20 mA through a 2-volt LED, but the circuitry required will almost certainly be more expensive than a resistor. In many situations, even when running from batteries, the power consumed by an LED represents a tiny fraction of overall energy usage; even if one could reduce LED-related power consumption by 99% using only $0.05 worth of extra circuitry, the savings wouldn't be worth the cost compared with simply using a resistor and accepting the sub-optimal efficiency.
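The two cases can be compared directly. With a resistor, the supply current equals the LED current and the LED receives VF/VS of the power; with a switching converter, the supply only has to deliver the LED's power divided by the converter efficiency. The 80% efficiency below is an assumption, chosen to show where the "roughly half" figure comes from:

    V_SUPPLY = 5.0
    V_F = 2.0
    I_LED = 0.020

    # Resistor drive: supply current equals LED current.
    p_supply_resistor = V_SUPPLY * I_LED          # 100 mW drawn from the supply
    fraction_to_led = V_F / V_SUPPLY              # 40% actually reaches the LED

    # Switching drive: supply delivers the LED power divided by converter efficiency.
    eta = 0.8                                     # assumed converter efficiency
    i_supply_switcher = (V_F * I_LED) / (eta * V_SUPPLY)   # ~10 mA, about half

    print(f"{fraction_to_led:.0%} of the power reaches the LED with a resistor")
    print(f"switcher input current: {i_supply_switcher*1000:.1f} mA vs {I_LED*1000:.0f} mA")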

Tags: led, resistors