Minimum energy required for LED blink to be visible

So, friends, I did the experiment.

The setup was two 5 mm LEDs (I'm not sure of the exact type, but most probably they have a 60° viewing angle and 40 mcd maximum intensity; I still haven't figured out how that intensity is measured): red with a 330 Ohm series resistor and green with a 160 Ohm one. Both were driven from a 5 V supply by an AVR microcontroller.

With this setup I was able to see a blink as short as 1 µs for the green LED and 2 µs for the red one. I should point out that I was in a well-lit room, but I cupped my palms around the LEDs to make a well about 3 inches deep around them. I looked directly at the LEDs and I was expecting the blink. So this amount of light is definitely not enough to notice a blink you aren't expecting.
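For reference, a pulse that short is easy to produce on an AVR. I won't swear this is exactly the code I ran, but it was something along these lines (a sketch assuming an ATmega328 at 16 MHz with the LED on PB0; compile with optimization and -DF_CPU=16000000UL so the delay macros work):

```c
#include <avr/io.h>
#include <util/delay.h>

int main(void)
{
    DDRB |= (1 << PB0);           /* LED pin as output */

    for (;;) {
        PORTB |= (1 << PB0);      /* LED on */
        _delay_us(1);             /* ~1 us pulse, plus a couple of
                                     cycles of port-write overhead */
        PORTB &= ~(1 << PB0);     /* LED off */
        _delay_ms(500);           /* pause so each blink is distinct */
    }
}
```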

Assuming a 1.2 V forward drop across the LED, the current can be estimated as (5 V − 1.2 V) / 330 Ohm = 3.8 V / 330 Ohm ≈ 11.5 mA for red and 3.8 V / 160 Ohm ≈ 23 mA for green.

So the electrical power in the LED is 11.5 mA × 1.2 V ≈ 14 mW for red and 23 mA × 1.2 V ≈ 28 mW for green.

Consequently, the electrical energy per blink was as low as 28 nJ (nanojoules!) in both cases: 14 mW × 2 µs for red and 28 mW × 1 µs for green. That is about ten times more than I expect to spend on a blink!
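Spelling the arithmetic out in one place (the 1.2 V forward drop is my rough figure; your LEDs will differ):

```c
#include <stdio.h>

int main(void)
{
    const double vcc = 5.0, vf = 1.2;          /* supply and LED drop, V  */
    const double r_red = 330.0, r_grn = 160.0; /* series resistors, ohms  */
    const double t_red = 2e-6, t_grn = 1e-6;   /* shortest visible blink  */

    double i_red = (vcc - vf) / r_red;         /* ~11.5 mA */
    double i_grn = (vcc - vf) / r_grn;         /* ~23.8 mA */

    printf("red:   %.1f mA  %.1f mW  %.0f nJ/blink\n",
           i_red * 1e3, i_red * vf * 1e3, i_red * vf * t_red * 1e9);
    printf("green: %.1f mA  %.1f mW  %.0f nJ/blink\n",
           i_grn * 1e3, i_grn * vf * 1e3, i_grn * vf * t_grn * 1e9);
    return 0;
}
```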

I tested this on my wife and my 7-year-old daughter. Same result.

Regarding the energy distribution versus time:

Unfortunately I wasn't able to change the resistors, so I tried just one thing: I drove the LED continuously with 1% duty-cycle PWM. I did not notice any difference when I changed the frequency (a 1 µs pulse every 100 µs looks just as bright as a 100 µs pulse every 10 ms). This is not exactly what I need, but it suggests that how the power is distributed in time doesn't matter much.
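In AVR terms the two regimes look something like this (again a sketch, assuming an ATmega328 at 16 MHz with the LED on OC1A/PB1 and Timer1 in fast PWM with ICR1 as TOP):

```c
#include <avr/io.h>

/* Start Timer1 in fast PWM mode 14 (TOP = ICR1), non-inverting on OC1A.
 * Pulse width is (compare + 1) timer ticks; period is (top + 1) ticks. */
static void pwm_start(uint16_t top, uint16_t compare, uint8_t clock_bits)
{
    ICR1   = top;
    OCR1A  = compare;
    TCCR1A = (1 << COM1A1) | (1 << WGM11);
    TCCR1B = (1 << WGM13) | (1 << WGM12) | clock_bits;
}

int main(void)
{
    DDRB |= (1 << PB1);                /* OC1A as output */

    /* 1 us pulse every 100 us: no prescaler, 16 ticks = 1 us. */
    pwm_start(1599, 15, (1 << CS10));

    /* 100 us pulse every 10 ms: /8 prescaler, 200 ticks = 100 us.
     * Swap this in for the call above to compare the two regimes. */
    /* pwm_start(19999, 199, (1 << CS11)); */

    for (;;)
        ;                              /* PWM runs entirely in hardware */
}
```

Both settings are 1% duty cycle; only the repetition rate differs.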

Regarding the sensitivity of different areas of the eye: I was able to see the blink only when looking directly at the LEDs. If I shifted my gaze even slightly, I couldn't see anything. I noticed the same thing with the constant (PWM) lighting.


Wikipedia suggests something on the order of 100 photons to achieve a visible response in the most ideal conditions.

The energy in a photon can be calculated by:

$$ E = {hc \over \lambda }$$

Where:

  - $h$ is the Planck constant, approximately $6.6\times10^{-34}\ \mathrm{J\cdot s}$,
  - $c$ is the speed of light, about $3\times10^{8}\ \mathrm{m/s}$, and
  - $\lambda$ is the wavelength of the photon.

The human eye's rod cells are most sensitive at a wavelength of 510 nm. So the energy of those photons is about $3.9\times10^{-19}$ joules per photon.

Multiply that by the roughly 100 photons required for detection by a human eye, and you get $3.9\times10^{-17}$ joules. By the law of conservation of energy, you will need at least this much electrical energy to make anything visible.

Of course LEDs aren't 100% efficient. Not all colors have the same luminous efficacy, so the most efficient LED for making human-visible light isn't necessarily the one at the wavelength where the eye is most sensitive. I'll leave that research as an exercise; let's just say the LED is 25% efficient. That increases the energy required by a factor of four, to:

$1.6\times10^{-16}$ joules

That is, by my rough estimation, the absolute minimum energy required to register a visual response with an LED in a human.
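If you want to sanity-check the arithmetic, here is the whole chain in a few lines of C (the 100 photons and 25% efficiency are the assumptions from above):

```c
#include <stdio.h>

/* Numeric check of the minimum-energy estimate. */
int main(void)
{
    const double h      = 6.6e-34;  /* Planck constant, J*s             */
    const double c      = 3e8;      /* speed of light, m/s              */
    const double lambda = 510e-9;   /* peak rod sensitivity, m          */

    double e_photon = h * c / lambda;    /* ~3.9e-19 J per photon       */
    double e_detect = 100.0 * e_photon;  /* ~100 photons for detection  */
    double e_led    = e_detect / 0.25;   /* assuming 25% efficient LED  */

    printf("per photon:        %.2g J\n", e_photon);
    printf("100 photons:       %.2g J\n", e_detect);
    printf("25%% efficient LED: %.2g J\n", e_led);
    return 0;
}
```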

You have orders of magnitude more energy stored in your capacitor, so under ideal conditions, it's likely you could register a visual response even after accounting for inefficiencies in getting the power from the capacitor into the LED.

Of course in practice the room won't be perfectly dark, the viewer won't be ideally acclimated, and the LED's light won't be focused to a tiny spot. So you may require more energy. Perhaps much more.


For some good hints about how to flash an LED on low power, look up the now-obsolete LM3909 LED flasher chip. Note how it "stacks" the voltage from the cap with the voltage from a 1.5 V cell to get enough forward voltage for the LED.

One normally used a capacitor in the range of tens of µF (not tens of nF) with this chip to produce a very visible flash on an LED of only moderate efficiency. I would estimate that this supplied about 50 µJ per flash, so you're probably an order of magnitude or two short of where you need to be.
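For scale, the energy a capacitor delivers is on the order of E = ½CV². The 100 µF and 1 V swing below are illustrative numbers chosen to match that ~50 µJ figure, not measurements of an actual LM3909 circuit:

```c
#include <stdio.h>

int main(void)
{
    const double cap   = 100e-6;  /* illustrative flasher cap, farads   */
    const double swing = 1.0;     /* assumed voltage swing per flash, V */

    double energy = 0.5 * cap * swing * swing;   /* E = 1/2 * C * V^2 */
    printf("energy per flash: %.0f uJ\n", energy * 1e6);  /* ~50 uJ */
    return 0;
}
```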