LED power consumption in theory and reality

Your PSU meter only has a resolution of 0.01 A (10 mA). The actual current could be anywhere between 5 mA and 15 mA for the single LED.

Switch your yellow multimeter to its mA range, plug the leads into the correct sockets, and wire the multimeter in series with the single LED to get a more accurate measurement.

Paralleling LEDs in this fashion is not recommended. The ones with the lower forward voltage drop will hog the current. Either connect them in series with a current-limited supply, or put a resistor in series with each LED to limit its current.

transistor and Passerby have both made perfectly good answers to the question you've asked, but let me try something a bit more comprehensive.

You seem to have a goodly number of LEDs, and if you have a few spares, try this experiment. Drive one LED at 1.9 volts. Record the current. Increase the voltage to 2.0. Now try 2.1. You'll see that the current increases very rapidly, and I'd be surprised if 2.1 volts doesn't kill the LED. Now replace the LED with a 200 ohm resistor and repeat the test. This establishes that, once the turn-on voltage is reached, current rises much more rapidly with an LED than with a resistor.
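
The experiment above can be sketched numerically with an idealized exponential diode equation next to Ohm's law for the resistor. The constants here are assumptions picked to give roughly 10 mA near 2 V, not measurements of your LEDs:

```python
import math

N_VT = 0.05     # assumed ideality factor * thermal voltage, volts
I_S = 4.25e-20  # assumed saturation current, tuned for ~10 mA at 2.0 V

for v in (1.9, 2.0, 2.1):
    i_led = I_S * (math.exp(v / N_VT) - 1.0)  # idealized diode equation
    i_res = v / 200.0                          # 200 ohm resistor, Ohm's law
    print(f"{v:.1f} V: LED ~ {i_led * 1000:6.1f} mA, resistor ~ {i_res * 1000:5.1f} mA")
```

With these assumed constants the LED current multiplies by about 7 for each 0.1 V step, while the resistor current only creeps up by about 5%.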

Now, here's something you may not know: for a fixed voltage, the current through an LED will increase as the LED's temperature increases.

Because it's getting hotter, its current will increase, and so will its temperature. Which, of course, means that its current will increase still more. You can see where this is leading - the technical term is thermal runaway. So this leads to the first and most important rule: never try to drive an LED from a voltage source. Always limit the current. This is most easily done by providing a higher voltage and using a current-limit resistor in series. In your case, a 5-volt supply and a 300 ohm resistor will give about 10 mA safely.
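
The series-resistor sizing above is just Ohm's law across the resistor. A quick sketch, assuming a ~2 V forward drop for the LED (check your LED's datasheet for the real figure):

```python
V_SUPPLY = 5.0    # volts
V_FORWARD = 2.0   # volts, assumed typical forward drop
R_SERIES = 300.0  # ohms

# The resistor sees the supply voltage minus the LED's forward drop.
current_ma = (V_SUPPLY - V_FORWARD) / R_SERIES * 1000
print(f"LED current ~ {current_ma:.0f} mA")  # (5 - 2) / 300 = 10 mA
```

The same formula, rearranged, picks the resistor for any target current: R = (Vsupply - Vforward) / Itarget.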

Furthermore, your setup shows that you got lucky in your choice of LEDs - they all seem to be about the same brightness. As Passerby stated, this is not generally true. So don't tie a bunch of LEDs together and drive them from a single resistor. Doing so will invite a range of brightness in the LEDs. If you don't want uniform brightness, you might think that this is OK, but there is one more thing to consider.

Let's say you have 10 LEDs in parallel, each drawing (you hope) 10 mA, for a total of 100 mA. To do this you use a 5 volt supply and a 30 ohm resistor. You're fine with the non-uniform brightness. Is there a problem?

Quite possibly. Just as the LEDs are not uniform in brightness for the same voltage, neither do they draw the same current at the same voltage.

Let's say that one of the LEDs naturally draws a bit more current than the others at the common voltage. Since power equals voltage times current, it dissipates more power than the others, and so it gets hotter. Heating shifts its I-V curve, so at the same shared voltage it draws still more current. In the worst case, the weakest LED will hog more and more current until it burns out, and it will probably fail open. This means that the next-weakest LED will start hogging current, and in the worst case the process will continue until all the LEDs are dead. This process can occur with other components as well, and has earned the nickname "firecracker mode". It is made possible in this case by a current limit which is set too high: the 100 mA limit set by the 30 ohm resistor allows a worst-case current distribution to kill the LEDs.
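
You can see the current-hogging effect in a toy model: two parallel LEDs sharing one resistor, where one LED's I-V curve sits slightly to the left (here modeled as a 30% higher saturation current). Every constant below is an assumption chosen to give ~10 mA near 2 V, not a real part:

```python
import math

V_SUPPLY, R = 5.0, 150.0         # volts; shared resistor sized for ~20 mA total
N_VT = 0.05                      # assumed ideality factor * thermal voltage
IS = [4.25e-20, 1.3 * 4.25e-20]  # saturation currents; LED 2 is the "weak" one

def led_current(i_s, v):
    """Idealized diode equation for one LED at node voltage v."""
    return i_s * (math.exp(v / N_VT) - 1.0)

# Bisect for the node voltage where the resistor current
# equals the sum of the LED currents.
lo, hi = 0.0, V_SUPPLY
for _ in range(100):
    mid = (lo + hi) / 2
    total = sum(led_current(i_s, mid) for i_s in IS)
    if (V_SUPPLY - mid) / R > total:
        lo = mid
    else:
        hi = mid

v = (lo + hi) / 2
for n, i_s in enumerate(IS, 1):
    print(f"LED {n}: {led_current(i_s, v) * 1000:.1f} mA")
```

At the shared node voltage, LED 2 draws 30% more current than LED 1, so it dissipates 30% more power; in a real circuit the resulting heating would skew the split further still.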

This leads to the other rule you should follow: limit the current to each LED separately. This usually means one resistor per LED, or per series string of LEDs. For instance, if you have a 12-volt source, you could put 4 or 5 LEDs in series, and use a single resistor to limit the current in the string. You can often get around this for small numbers of LEDs, as long as you are aware of the consequences. With 2 LEDs in parallel, you probably don't have to worry about firecracker-mode failures, since not many LEDs will die at twice the normal operating current, but you will still probably get unequal brightness. The more LEDs you put in parallel, the greater the odds of catastrophic failure. The choice is up to you, and you will probably want to take chances until you've been burned a few times.
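
Sizing the resistor for that 12-volt series string follows the same arithmetic as the single-LED case. A sketch, again assuming a ~2 V drop per LED and a ~10 mA target (substitute your own LED's figures):

```python
V_SUPPLY = 12.0   # volts
V_FORWARD = 2.0   # assumed forward drop per LED, volts
N_LEDS = 5        # LEDs in the series string
I_TARGET = 0.010  # target current, amps

# The resistor drops whatever voltage the string doesn't.
r = (V_SUPPLY - N_LEDS * V_FORWARD) / I_TARGET
print(f"Series resistor ~ {r:.0f} ohms")  # (12 - 5*2) / 0.010 = 200 ohms
```

Note that the resistor must drop at least some voltage for this to work: with 6 of these assumed LEDs the string would want 12 V on its own, leaving nothing for the resistor to regulate.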

"Good judgement comes from experience. Experience comes from bad judgement."

You are assuming that each of these LEDs has a perfectly identical I-V curve. The stated specs are nominal, typical figures, and there will be part-to-part variation.

One LED may draw 10 mA at a forward voltage of 1.9 V, while another may draw 8 mA, 12 mA, or something else entirely at the same voltage. That's not even taking brightness into account: two LEDs with the same I-V curve can still be noticeably different in color and brightness.

You also have to account for your supply's display precision and rounding. It only measures to the hundredth of an amp (10 mA), which is not enough resolution for readings in the single-milliamp range.

Also take into account the resistance of the breadboard you are using. If you measure the voltage across the first LED and the last LED, you may notice a difference.

You should use a good ammeter, or a multimeter in current mode, and individually measure each of the LEDs in this circuit to see how much current each one is actually drawing.