Voltage at what Amperage

Voltage (which is kind of like the strength, or pressure, of the supply) and current (measured in amps, which is the rate at which electricity flows) are two very different things.

Voltage: When trying to match a supply to a device, you need to get the voltage right... if the supply voltage is too high, it will damage your device; if the supply voltage is too low, your device just won't work.

Current: When looking at current, you need to make sure the supply's amp rating is at least as high as what the device needs, because the device will only draw as much current as it needs. If the supply's rating is too low, the device will try to pull more current than the supply can provide, so the supply will get hot and possibly fail. If you had a supply rated at 1 billion amps, it would still happily power a tiny bulb... it just means it could also power a billion bulbs or more at the same time!

So, the possible dangerous situations are:

  1. If the voltage is too high for the device.
  2. If the amps are too low for the device.
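If it helps to see both rules in one place, here is a minimal Python sketch of the compatibility check; the 5% voltage tolerance and the example ratings are just illustrative assumptions, not values from any datasheet:

```python
def supply_is_suitable(device_volts, device_max_amps, supply_volts, supply_rated_amps):
    """Rough check: the voltages must match (within a small tolerance),
    and the supply must be rated for at least the device's maximum draw."""
    voltage_ok = abs(supply_volts - device_volts) <= 0.05 * device_volts
    current_ok = supply_rated_amps >= device_max_amps
    return voltage_ok and current_ok

print(supply_is_suitable(5.0, 0.5, 5.0, 2.0))   # True:  5 V matches, 2 A >= 0.5 A
print(supply_is_suitable(5.0, 0.5, 9.0, 2.0))   # False: 9 V would risk damaging the device
print(supply_is_suitable(5.0, 0.5, 5.0, 0.3))   # False: a 0.3 A supply would be overloaded
```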

As a general rule, devices that produce a lot of heat or light or movement usually need a high current supply. Devices that control things, like a TV remote or some small gadget with maybe a few LEDs on it, won't need a lot of current.

To answer your question, the microcontroller itself probably only needs between 0.02 and 0.1 amps. If the microcontroller is controlling something else, and sharing the supply, then the current rating of the supply really depends on the device.
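As a rough way of sizing a shared supply, you can add up the worst-case current of everything it has to power and leave some headroom. The figures below are hypothetical examples, not measurements from any particular board:

```python
# Hypothetical worst-case current draws, in amps
loads_amps = {
    "microcontroller": 0.05,  # somewhere in the 0.02-0.1 A range mentioned above
    "indicator LEDs":  0.04,  # a couple of LEDs at roughly 20 mA each
    "small motor":     0.60,  # things that move or heat up need far more
}

total_amps = sum(loads_amps.values())
headroom = 1.5  # a ~50% margin is a common rule of thumb
print(f"Total draw ~{total_amps:.2f} A; pick a supply rated for at least {total_amps * headroom:.2f} A")
```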


If you connect a 5 V 100 mA device to a 5 V 1 billion amp power supply, the device will draw 100 mA.


Of course it won't draw more than that: a device only takes as much current as it requires (for a simple resistive load this is just Ohm's Law, I = V/R). The maximum current capability of the supply is irrelevant, as long as it is greater than the peak current the device needs.
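For a resistive load you can see this directly from Ohm's Law; here is a quick Python check, where the 50 Ω resistance is just an assumed value chosen so the numbers come out to 100 mA:

```python
supply_voltage = 5.0      # volts, set by the supply
device_resistance = 50.0  # ohms, assumed here purely for illustration

# Ohm's Law: the current is set by the voltage and the device's resistance
current_amps = supply_voltage / device_resistance
print(f"The device draws {current_amps * 1000:.0f} mA")  # -> 100 mA
```

The supply's amp rating never appears in the calculation; only the voltage and the device itself determine how much current flows.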