I'm trying to use 3v LEDs in a 12V circuit, in my car

Decoding your question, here is what I arrive at:

You are using the two resistors as a voltage divider, with the LEDs wired in parallel and connected to the junction of these two resistors.

(Schematic: two resistors as a voltage divider, with the LEDs at the junction. Created using CircuitLab.)

This is not a valid approach: as soon as the LEDs draw current from the junction, the loaded divider will no longer provide the (approximately) 1:4 voltage division your calculations indicate.
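To see why, consider the loading effect: a resistive divider only holds its ratio while nothing draws current from the junction, and the LEDs will pull the junction voltage down by their current times the divider's source resistance. Here is a minimal sketch of that arithmetic, using hypothetical divider values chosen only to illustrate a 12 V to 3 V division (your actual resistor values will differ, but the effect is the same):

```python
# Why the loaded divider fails. The divider values below are hypothetical,
# chosen only to give the 1:4 (12 V -> 3 V) division from the question.
V_SUPPLY = 12.0    # car supply, volts
R_TOP = 330.0      # upper divider resistor, ohms (assumed value)
R_BOTTOM = 110.0   # lower divider resistor, ohms (assumed value)

# Unloaded, the division ratio looks right:
v_unloaded = V_SUPPLY * R_BOTTOM / (R_TOP + R_BOTTOM)
print(f"Unloaded junction voltage: {v_unloaded:.2f} V")          # 3.00 V

# But the junction behaves like a source with R_TOP || R_BOTTOM behind it:
r_source = R_TOP * R_BOTTOM / (R_TOP + R_BOTTOM)
print(f"Source resistance at the junction: {r_source:.1f} ohm")  # 82.5 ohm

# Three LEDs trying to draw 20 mA each (60 mA total) would drag the junction
# down by far more than the 3 V available, so the LEDs never get that current:
sag = 3 * 0.020 * r_source
print(f"Sag under a 60 mA load: {sag:.2f} V")                    # ~4.95 V
```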

Instead, either wire the three LEDs in series with a single series resistor, or, if you must have them in parallel, give each LED its own series resistor.

(Schematic: three LEDs in series with one resistor, and the alternative parallel arrangement with one resistor per LED, with indicative resistor values. Created using CircuitLab.)

The indicative resistor values above assume 20 mA per LED as optimal. Do your own calculations for your specific LEDs.
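If you want to check the numbers yourself, here is a minimal sketch of the calculation behind those indicative values, assuming a 12 V supply, roughly 3 V forward drop per LED and 20 mA per LED (substitute the figures from your LED's datasheet):

```python
# Resistor sizing for the two arrangements shown above, assuming a 12 V supply,
# about 3 V forward drop per LED and 20 mA per LED (use your LED's real figures).
V_SUPPLY = 12.0
V_LED = 3.0     # assumed forward drop, volts
I_LED = 0.020   # target current, amps

# Option 1: three LEDs in series, one resistor for the whole string.
r_series_string = (V_SUPPLY - 3 * V_LED) / I_LED
print(f"Series string: about {r_series_string:.0f} ohm")   # ~150 ohm

# Option 2: LEDs in parallel, one resistor per LED.
r_per_led = (V_SUPPLY - V_LED) / I_LED
print(f"Parallel: about {r_per_led:.0f} ohm per LED")       # ~450 ohm
```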

For a ready-to-use online calculator, use an "LED resistor calculator" instead of a voltage divider one. Most of these even provide the correct schematic or wiring diagram, whichever you prefer, for wiring the LEDs up.


The issue is that 3.6V is (very likely) not the maximum voltage for each LED, and therefore you don't need a voltage divider.

The 3.6 V maximum you mention is most likely the voltage drop across the LED when it is conducting. It is not a maximum voltage; LEDs don't really care about voltage (this is an approximation, but a useful one). What LEDs do care about is current: they must be protected from excessive current, and they cannot do this for themselves, so to speak, because they have almost no resistance. Most regular LEDs "like" 20 mA maximum, otherwise they melt, shrivel, and die. I don't know about your LEDs, but let us assume 20 mA as a good value for the current (incidentally, you can find this in the datasheet or on the eBay listing as "maximum forward current", usually given in milliamps; for many LEDs the maximum forward current is 25 mA, so 20 mA is a good value to shoot for). With these assumptions, here's what the situation looks like:

After the 3.6 V drop across the LED, you have 8.4 V left over, and the current must be kept at a bit under 20 mA. According to Ohm's law (\$V=IR\$), we work out that you need 8.4/0.02 = 420 Ohm current-limiting resistors, assuming the LEDs are in parallel with a resistor for each LED. How much power is that? Power is just the product of current and voltage, i.e. \$V \times I\$, or volts times amps. We have an 8.4 V drop across each resistor and 20 mA of current, so we get 8.4 × 0.02 ≈ 0.17 W for each resistor. Since your resistors are rated 1/2 W, 0.17 W is well within their limits.

On a side note, 420 Ohm resistors are less common than 470 Ohm resistors, so just use 470 Ohm. Your LEDs will be a bit less bright, but a bit safer.
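As a quick sanity check of the current and power with the standard 470 Ohm value (using the 3.6 V drop and 12 V supply from above, one resistor per LED):

```python
# Quick check with the standard 470 ohm value, using the 3.6 V drop and 12 V
# supply from the discussion above (LEDs in parallel, one resistor per LED).
V_SUPPLY = 12.0
V_LED = 3.6
R = 470.0

i_led = (V_SUPPLY - V_LED) / R            # current through each LED and resistor
p_resistor = (V_SUPPLY - V_LED) * i_led   # power dissipated in each resistor
print(f"LED current: {i_led * 1000:.1f} mA")        # about 17.9 mA, safely under 20 mA
print(f"Resistor dissipation: {p_resistor:.2f} W")  # about 0.15 W, fine for a 1/2 W part
```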

Again I am assuming a parallel setup like this:

(Schematic: the LEDs in parallel, each with its own series resistor. Created using CircuitLab.)
