Easy way to figure out an LED's Vf in order to pick an appropriate resistor

I agree with some of the others here... you're trying too hard.

As others have mentioned, the forward drop of an LED varies with its bias current, but for almost every application a hobbyist will get into, this isn't something you have to spend a great deal of time worrying about.

Almost every handheld multimeter has a diode setting. It will tell you the forward voltage of a diode at the meter's testing bias level (usually a few mA). This will put you in the right ballpark very quickly.

Determining LED Forward Drop (easy way)

  1. Set the meter to its diode setting (i.e. #14 in this picture).
  2. Connect the LED to the meter leads, observing correct polarity.
  3. The meter will indicate the forward drop (usually 1V-3V for most LEDs). Note that the LED may glow.

Now that you have the LED's forward voltage drop, you can figure out how much voltage everything else in the "chain" will need to drop. For very simple circuits it may just be a limiting resistor. For more complex circuits it may be a bipolar or field-effect transistor, or maybe even something more esoteric. Either way: the supply voltage in a series circuit is distributed across all of the elements in the circuit. Let's assume a very simple circuit with a red LED, a resistor and a 5V supply.

If the meter indicated 1.2V Vf for the LED, you know your resistor will have to drop 5V - 1.2V or 3.8V. Assuming you want about 10mA through the LED it's now a simple matter of applying Ohm's law. We know that in a series circuit the current through all elements must be identical, so 10mA through the resistor means 10mA through the LED. So:

R = V / I
R = 3.8V / 10mA
R = 380 ohms
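As a sketch, the same calculation in Python (the 5V supply, 1.2V Vf and 10mA target are the example values from the text):

```python
# Series resistor for an LED: the resistor drops whatever the LED doesn't.
supply_v = 5.0    # supply voltage (example value from the text)
led_vf = 1.2      # forward voltage read off the meter's diode setting
target_i = 0.010  # desired LED current: 10 mA

resistor_v = supply_v - led_vf  # voltage the resistor must drop (3.8 V)
r = resistor_v / target_i       # Ohm's law: R = V / I
print(f"R = {r:.0f} ohms")      # prints "R = 380 ohms"
```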

If you connect your LED to your 5V supply with a 380 ohm resistor in series, you will find the LED glowing brightly as you intended. Now can your resistor handle the power dissipation? Let's see:

P = V * I
P = 3.8V * 10mA
P = 38mW

38mW is well within the dissipation spec for any 1/4W or 1/8W resistor. Generally speaking, you want to stay well under the power rating for a device unless you know what you're doing. It's important to realize that a resistor that is rated for 1/4W will not necessarily be cool to the touch when dissipating 1/4W!
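The dissipation check above, sketched in Python with the same example numbers:

```python
# Power dissipated by the series resistor: P = V * I
resistor_v = 3.8  # voltage across the resistor (5 V supply minus 1.2 V Vf)
current = 0.010   # 10 mA flows through every element of the series circuit
p = resistor_v * current
print(f"P = {p * 1000:.0f} mW")  # prints "P = 38 mW"
assert p < 0.25 / 2  # comfortably under half of a 1/4 W resistor's rating
```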

What if you wanted to drive that same LED with a 24V supply? Ohm's law to the rescue again:

R = V / I
R = (24V - 1.2V) / 10mA
R = 22.8V / 10mA
R = 2280 ohms (let's use 2.4k since it's a standard E24 stock value)

And a power check (using an alternate power equation just to change things up):

P = V^2 / R
P = (22.8V)^2 / 2400 ohms
P = 217mW

Now you'll notice that by driving the applied voltage up we have driven the voltage across the resistor up, and that in turn causes the total power dissipated by the resistor to go up considerably. While 217mW is technically under the 250mW a quarter-Watt resistor can handle, it will get HOT. I'd suggest moving to a 1/2W resistor. (My rule of thumb for resistors is to keep their dissipation to under half their rating unless you're actively cooling them or have specific needs laid out in the specification).
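The 24V case can be sketched the same way, including the half-the-rating rule of thumb from above (2.4k is the E24 stock value chosen in the text):

```python
# Same LED on a 24 V supply: recompute R, use the 2.4k stock value, check power.
supply_v = 24.0
led_vf = 1.2
target_i = 0.010  # 10 mA

r_exact = (supply_v - led_vf) / target_i  # 2280 ohms
r_stock = 2400.0                          # standard E24 value (2.4k)
p = (supply_v - led_vf) ** 2 / r_stock    # P = V^2 / R, as in the text
print(f"P = {p * 1000:.0f} mW")           # prints "P = 217 mW"
# 217 mW exceeds half of a 1/4 W rating (125 mW), so step up to a 1/2 W part.
assert p > 0.25 / 2
```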


If you have a power supply with adjustable current limit (like this one), then it becomes very easy.

  1. Set the output voltage to around 5V and dial the current limit all the way down.
  2. Connect the diode directly to the power supply, with no resistor. Don't worry! You've already limited the current!
  3. Dial up the current until it reaches your target (say, 20mA).

The power supply is limiting the current through the LED to the dialed-in limit. The voltage display will show you what voltage is required to push that much current. That's your forward voltage!


Most common LEDs can handle at least 20 mA, so if you select a resistor value that will pass 20 mA when connected directly across your power supply, an LED will not be damaged when connected in series with that resistor. Then just measure the voltage across the LED to get the LED's forward voltage. The LED voltage will vary slightly with current, but the current you eventually choose to use is not at all critical.
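This "safe resistor first" approach can be sketched too; the 5V supply here is an assumption for the example, not a value from the text:

```python
# Size the resistor so that even with the full supply across it (worst case,
# as if the LED dropped nothing) the current stays at a safe 20 mA.
supply_v = 5.0  # assumed supply voltage for this sketch
max_i = 0.020   # 20 mA: a current most common LEDs can handle
r_safe = supply_v / max_i
print(f"R = {r_safe:.0f} ohms")  # prints "R = 250 ohms"
# In the real circuit the LED drops part of the voltage, so the actual current
# is lower; now measure the voltage across the LED to read its Vf.
```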

I generally assume that common red, yellow and green LEDs are about 2 volts, and I aim for about 10 mA of current (although I recently had some extremely efficient green LEDs where I had to reduce the current to under 1 mA to get the desired brightness, or rather dimness). No real need to get extremely scientific about it!