Is it really OK to supply more current than what the component is rated for?

Your misconception stems from one incorrect statement: "Since most power supplies provide a current that is higher than what most LEDs can handle, you must put a resistor in front of the LED in order to not burn it out."

The reason for the resistor in series with your LED is this: if your power supply provides a higher voltage than the LED requires, and it is capable of supplying more current than the LED can handle, then you must limit the current your circuit draws from the power supply by using a suitable series resistor.

A 5 A power supply will not force 5 A through anything you connect to it. It will only allow up to a maximum of 5 A to flow, and how much actually flows depends on the power supply voltage and the effective total resistance of the circuit connected to it.
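As a rough illustration (every number below is an assumption for the example, not a value from the question), here is how the series resistor would be sized; notice that the supply's current rating never enters the calculation:

```python
# Sizing a series resistor for an LED - illustrative sketch.
# Every value here is an assumption for the example, not from the question.

supply_voltage = 5.0          # V, what the supply regulates to
supply_current_rating = 5.0   # A, the MAXIMUM the supply can deliver (a ceiling, not a push)
led_forward_voltage = 2.0     # V, assumed typical red-LED drop
led_target_current = 0.020    # A, assumed desired operating current

# The resistor must drop the difference between supply and LED voltages
# at the target current. The 5 A rating never appears in this maths.
resistor = (supply_voltage - led_forward_voltage) / led_target_current
print(f"Series resistor: {resistor:.0f} ohms")   # -> 150 ohms

# The circuit then draws only 20 mA; the rest of the supply's 5 A
# capability simply goes unused.
```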


To answer the title of your question: no, it is not OK to supply more current to a component than its rated value.

However, it is OK to have a voltage supply rated for more current than the component's rated value, because the component will only draw as much as it needs. If you forcefully push more current into the component, for example with a constant-current source or by applying too large a voltage (which will cause more current to flow), then the component will exceed its rated value, heat up and be destroyed. But if you use the rated voltage, then the load will only take what it requires, regardless of how much current is available to be drawn from the source.

The difference is in how you word your question.
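A rough sketch of that distinction, using a purely hypothetical part rated for 5 V at 100 mA (so 0.5 W) and crudely treating it as a fixed resistance at its operating point:

```python
# Hypothetical part rated 5 V at 100 mA (0.5 W), crudely modelled as a
# fixed 50 ohm resistance at its operating point. Illustrative numbers only.
rated_voltage = 5.0                                   # V
rated_current = 0.100                                 # A
effective_resistance = rated_voltage / rated_current  # 50 ohms
rated_power = rated_voltage * rated_current           # 0.5 W

# Case 1: apply the rated voltage from a supply that COULD deliver many amps.
drawn_current = rated_voltage / effective_resistance
print(drawn_current)        # 0.1 A - the part only takes what it needs

# Case 2: force 1 A through it with a constant-current source.
forced_current = 1.0
forced_power = forced_current ** 2 * effective_resistance
print(forced_power)         # 50 W, a hundred times the 0.5 W rating - smoke
```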


A simple example: you can have a power supply rated for 5 V at 1 billion amps. Now say you attach a resistor to this supply, let's say 5 ohms. How much current will it draw: (a) 1 A, or (b) 1 billion amps?

The answer is (a). Ohm's law says that I = V/R, so if you have a 5 V supply across a 5 ohm resistor, you get 1 A flowing. But what happened to the other 999,999,999 amps? There wasn't enough voltage to drive them through the circuit. Now, if you had a 5 nanoohm (5e-9 ohm) resistor, then you would get your 1 billion amps flowing.
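A quick check of that arithmetic (the 5 nanoohm resistor is purely hypothetical, of course):

```python
# Ohm's law check for the 1-billion-amp supply example.
V = 5.0                       # supply voltage
for R in (5.0, 5e-9):         # the 5 ohm resistor, and a hypothetical 5 nanoohm one
    I = V / R                 # Ohm's law: I = V/R
    print(f"R = {R} ohm -> I = {I:.3g} A")
# R = 5.0 ohm   -> I = 1 A
# R = 5e-09 ohm -> I = 1e+09 A  (there's your 1 billion amps)
```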

In an LED circuit, the diode is non-linear. This means that as the voltage increases, the current does not increase in proportion as Ohm's law would suggest; it increases roughly exponentially. An LED might conduct 10 mA at 2 V but 1 A at 2.1 V, for example. Real parts are not usually quite that extreme, but you can see that if we don't limit the current, the LED will undoubtedly blow up.

How does the resistor help? You can consider the LED to be like an ideal voltage source (not quite true, but bear with me). This example LED drops roughly the same voltage at 10 mA as it does at 1 A, so we say: it always has about the same voltage across it, so if we add a resistor, the voltage across that resistor will be the supply voltage minus what the LED drops. We can then use Ohm's law to select a resistor that drops that voltage at the required current.
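To show how steep that exponential behaviour is, here is a sketch of an idealised diode-law LED anchored at the assumed 10 mA / 2 V point; the 22 mV slope factor is an illustrative assumption, not a measured parameter, and real LEDs have series resistance that softens the curve:

```python
import math

# Idealised exponential LED model anchored at an assumed operating point.
V_REF, I_REF = 2.0, 0.010   # assumed: 10 mA at 2.0 V
N_VT = 0.022                # assumed exponential slope factor, in volts

def led_current(v_forward):
    """Current through the idealised LED at a given forward voltage."""
    return I_REF * math.exp((v_forward - V_REF) / N_VT)

for v in (2.00, 2.05, 2.10):
    print(f"{v:.2f} V -> {led_current(v) * 1000:.0f} mA")
# 2.00 V -> 10 mA
# 2.05 V -> 97 mA
# 2.10 V -> 942 mA   (a tiny voltage change causes a huge current change)
```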


Now, here is the point where the current rating of a supply becomes important. Say you have a supply rated at 5 V, 10 mA. You connect a 5 ohm resistor to it. What is the current: (a) 1 A, or (b) much less?

The answer would be (b). Why? The supply simply cannot drive that much current; it could be because of its internal resistance, or it could be a current-source-type supply. So what happens is that either the voltage at the supply terminals decreases (because, say, more voltage is dropped across the internal resistance), or the supply blows up, melts, burns out, however you want to phrase it, or both. The key thing here is that if the supply survives and its voltage has dropped, then there is less voltage across the resistor, which means less current is required to satisfy Ohm's law. This all happens in a very quick transient, so essentially all you see is a 5 ohm resistor with a very low voltage across it.
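One crude way to picture that, assuming the weak supply behaves like an ideal 5 V source behind a large internal resistance (a real current-limited supply differs in detail):

```python
# Crude model of an overloaded "5 V, 10 mA" supply: an ideal 5 V source
# behind a large internal resistance. All values are illustrative assumptions.
V_open = 5.0        # V, open-circuit voltage of the supply
R_internal = 490.0  # ohms, assumed internal resistance of the weak supply
R_load = 5.0        # ohms, the external resistor

# The whole loop obeys Ohm's law, so both resistances set the current:
I = V_open / (R_internal + R_load)   # about 10 mA, nowhere near 1 A
V_terminal = I * R_load              # what you actually see at the terminals

print(f"I = {I * 1000:.1f} mA, terminal voltage = {V_terminal:.2f} V")
# I = 10.1 mA, terminal voltage = 0.05 V  (the 5 V has all but collapsed)
```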


As a direct answer to the question title: in most cases, no. The rated current is the current at which the manufacturer says the component will work correctly.

In many cases the component is something like an LED or a resistor (usually limited by its power rating rather than by current, but still...) which, without current limiting or with the wrong supply voltage, could easily conduct much more current than it is rated for, resulting in excessive heating and/or damage.

In other cases, if you apply the correct supply voltage, the device will operate at its required current even if the supply is capable of sourcing much more than that. This is because all devices are, in the end, just resistances, whether fixed-value ones or ones whose resistance changes with voltage (e.g. semiconductors, transistors, etc.). At the given supply voltage, the arrangement of these resistances will operate at the current level it is designed for.
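As a final sketch of that idea, here are a few made-up but typical-looking operating points and the effective resistance each one presents to the supply:

```python
# Effective resistance a device presents at its operating point.
# The voltage/current pairs below are assumed, typical-looking values.
operating_points = {
    "LED (2 V, 20 mA)":             (2.0, 0.020),
    "small DC motor (12 V, 0.5 A)": (12.0, 0.5),
    "logic IC (3.3 V, 30 mA)":      (3.3, 0.030),
}

for name, (v, i) in operating_points.items():
    print(f"{name}: looks like {v / i:.0f} ohms to the supply")
# At the correct supply voltage each device draws only its own current,
# no matter how many amps the supply could deliver.
```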