Why is power consumption sometimes given in mA and not in units of Watts?

Most electronic systems use a fixed voltage, so you can determine the power from the current alone. This also comes up in systems that use linear regulators, where the input current stays the same regardless of the input voltage you apply (within a certain range, of course). For these systems it makes much more sense to give a current rating, since the power in watts will change based on the voltage you apply.

Also, many times it is the current that is the limiting factor, because of things like trace/wire width, and not necessarily the power being consumed.

USB is an example of this limit. The USB specification limits the current drawn by bus-powered devices to a total value per port: 500 mA for v2.0 and 900 mA for v3.0; see http://en.wikipedia.org/wiki/Universal_Serial_Bus#Power
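
As a rough illustration (a minimal Python sketch, assuming the nominal 5 V bus voltage), those per-port current limits translate directly into a power budget:

```python
# Quick sanity check of the USB per-port power budget mentioned above.
# The 5 V nominal bus voltage and the 500 mA / 900 mA limits come from
# the USB 2.0 / 3.0 figures cited in the answer.

V_BUS = 5.0  # nominal USB bus voltage, volts

def max_power_w(current_limit_ma: float, bus_voltage_v: float = V_BUS) -> float:
    """Maximum power a bus-powered device may draw: P = V * I."""
    return bus_voltage_v * (current_limit_ma / 1000.0)

print(f"USB 2.0: {max_power_w(500):.1f} W")  # 2.5 W
print(f"USB 3.0: {max_power_w(900):.1f} W")  # 4.5 W
```

So quoting 500 mA on a nominal 5 V bus carries the same information as quoting 2.5 W, while also naming the quantity the port actually limits.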


A simple multimeter can measure amps or volts directly. Watts have to be calculated from two measurements, and as most meters don't do the calculation for you, it would have to be done by hand. (Consider that digital meters didn't exist for many decades of engineering history.) Speaking in amperes keeps the engineer from having to perform these calculations all day long and speeds up the work considerably.

As the voltage is fixed in many applications (anywhere there's a voltage regulator, e.g. a power rail), essentially the same information can be conveyed by speaking in amps rather than watts. For those interested in the actual wattage, the calculation is trivial (albeit slightly inaccurate if the voltage is running a few percent off its nominal value).
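
If it helps, here is a small sketch of that trivial calculation, using an assumed 3.3 V rail and an illustrative ±5 % regulation tolerance to show how little the wattage estimate drifts off-nominal:

```python
# Minimal sketch: converting a current figure to watts on a fixed rail,
# and how far the estimate drifts if the rail sits a few percent off
# nominal. The 3.3 V rail and +/-5 % tolerance are illustrative assumptions.

def watts(current_a: float, rail_v: float) -> float:
    return current_a * rail_v

nominal_v = 3.3      # assumed nominal rail voltage
tolerance = 0.05     # assumed +/-5 % regulation
i_load = 0.150       # 150 mA load, expressed in amps

p_nom = watts(i_load, nominal_v)
p_lo  = watts(i_load, nominal_v * (1 - tolerance))
p_hi  = watts(i_load, nominal_v * (1 + tolerance))

print(f"nominal: {p_nom:.3f} W, range: {p_lo:.3f} W to {p_hi:.3f} W")
```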

Keeping more information visible permits deeper insight. In general, circuits are controlled by voltage or current, or at least it's easier to think of the details that way. Those measurements are closer to what's going on with the electrons, while watts measure the rate of energy transfer and tend to be more a descriptor of the heat being dissipated.

For instance, if you're trying to keep a capacitor from burning out, you care about things like peak voltage, inrush current, and ripple current, considered as a percentage of the maximum allowable (minus derating). It makes a big difference whether your 1 watt is 1 volt x 1 amp or 1 kV x 1 mA, especially on the small time scales where boundary conditions live. Exceeding specs on a device can destroy it without the device ever getting particularly warm (although it may get quite hot if it fails to a low resistance).
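
As a purely hypothetical sketch of that kind of spec check (all ratings, measurements, and the 80 % derating factor below are made up for illustration, not real part data):

```python
# Hypothetical sketch of the "percentage of maximum allowable (minus
# derating)" idea: compare observed peak stresses on a capacitor against
# its derated ratings. Every number here is an illustrative assumption.

DERATING = 0.80  # use only 80 % of the datasheet rating

ratings = {            # datasheet maximums (hypothetical)
    "peak voltage (V)":   50.0,
    "ripple current (A)":  2.0,
}
measured = {           # observed peaks in the circuit (hypothetical)
    "peak voltage (V)":   36.0,
    "ripple current (A)":  1.9,
}

for name, limit in ratings.items():
    derated = limit * DERATING
    pct = 100.0 * measured[name] / derated
    flag = "OK" if pct <= 100.0 else "OVER"
    print(f"{name}: {pct:.0f}% of derated limit ({derated:g}) -> {flag}")
```

Note that neither check involves watts at all; the stresses that matter are expressed directly in volts and amps.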

Where it makes more sense to speak in watts is when you have a limited store of energy that doesn't behave as a fixed voltage source and you want to know how long it can supply energy at a given rate before it's empty, for instance when estimating battery run time. A cooling system might also be specified in watts, as a comparison between how fast heat is generated and how fast it is dissipated, which is what drives temperature.
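
For example, a back-of-the-envelope run-time estimate works naturally in watt-hours and watts (the capacity, load, and converter efficiency below are assumed illustrative figures):

```python
# Back-of-the-envelope battery run-time estimate, where thinking in watts
# (or watt-hours) is the natural unit. All figures are assumptions.

battery_wh = 37.0      # e.g. roughly a 10 Ah pack at 3.7 V nominal
load_w = 2.5           # average power drawn by the device
efficiency = 0.85      # assumed converter efficiency

runtime_h = battery_wh * efficiency / load_w
print(f"Estimated run time: {runtime_h:.1f} hours")
```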


No good reason. In general, mA alone isn't enough information. But given that USB is 5 V, you can get watts from mA. As pointed out by @Kellenjb, many chips run at (approximately) fixed voltages, so the same applies to them.