Why are deep-cycle batteries rated in amp-hours instead of watt-hours?

Simply put, watts measure a rate of consumption, while amp-hours or kilowatt-hours measure a total quantity: charge in the first case, energy in the second. Batteries store energy, so a unit that quantifies the amount stored is useful.

Watts are 'joules per second': 1 watt is the consumption of 1 joule per second. A watt-hour is the energy consumed by a 1-watt load running for one hour (1 Wh = 3600 J). This is the same unit, scaled up to kilowatt-hours, that the power company uses to determine your electric bill.
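To make those relationships concrete, here is a minimal Python sketch; the 60 W load is just an illustrative value, not anything from the question:

```python
# Relationships between watts, joules, watt-hours, and amp-hours.
# The 60 W load and one-hour duration are assumed, illustrative values.

power_w = 60.0                 # a 60 W load
seconds = 3600                 # one hour
energy_j = power_w * seconds   # watts are joules per second
energy_wh = energy_j / 3600    # 1 Wh = 3600 J
print(energy_wh)               # 60.0 Wh for a 60 W load over one hour

# A utility bill counts the same unit, scaled up:
energy_kwh = energy_wh / 1000
print(energy_kwh)              # 0.06 kWh
```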

So the question may be restated as "Why are batteries rated in amp-hours instead of kilowatt-hours?", and there's a good reason for that as well. By specifying voltage separately, the unit of amp-hours allows convenient recalculation for different series/parallel combinations of batteries, as well as easier estimation of battery run time (measuring load current is easier than measuring load power). The sketch below illustrates the series/parallel bookkeeping.
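This is an idealised model, assuming identical 12 V, 100 Ah batteries and ignoring matching and balancing; it only shows why keeping Ah and voltage separate makes the arithmetic convenient:

```python
# Idealised series/parallel battery bank arithmetic (assumed 12 V, 100 Ah units).
# Real banks need matched batteries and balancing; this shows only the bookkeeping.

cell_v, cell_ah = 12.0, 100.0

def bank(series: int, parallel: int) -> tuple[float, float, float]:
    """Return (voltage, amp-hours, watt-hours) for a series x parallel bank."""
    v = cell_v * series        # series: voltages add, Ah unchanged
    ah = cell_ah * parallel    # parallel: amp-hours add, voltage unchanged
    return v, ah, v * ah       # watt-hours follow as V x Ah

print(bank(2, 1))   # (24.0, 100.0, 2400.0): 24 V, 100 Ah, 2400 Wh
print(bank(1, 2))   # (12.0, 200.0, 2400.0): 12 V, 200 Ah, 2400 Wh
```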


Amp-hours better specify what the battery stores and provides than watt-hours do.

Amp-hours relate to the basic chemical reaction of the battery, whereas watt-hours are much more affected by state of charge and by the rate of charge and discharge.

In a LiFePO4 battery the Ah efficiency can be 99.5%+, but the watt-hour (energy) efficiency can be 70% - 90% depending on various conditions and parameters. A standard Li-ion battery is somewhat similar, and a lead-acid battery can achieve over 90% current (= Ah) efficiency.
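As a rough illustration of how a near-perfect Ah efficiency can still mean a lower Wh efficiency, assume a cell that charges at a higher average voltage than it discharges. The average voltages below are assumptions for illustration, not measured data:

```python
# Why Ah efficiency and Wh efficiency differ (illustrative LiFePO4-like numbers).
ah_in = 100.0
k_ah = 0.995             # coulombic (Ah) efficiency, per the text
v_charge_avg = 3.45      # assumed average cell voltage while charging
v_discharge_avg = 3.20   # assumed average cell voltage while discharging

ah_out = ah_in * k_ah
wh_in = ah_in * v_charge_avg
wh_out = ah_out * v_discharge_avg

print(ah_out / ah_in)    # ~0.995: almost all the charge comes back out
print(wh_out / wh_in)    # ~0.92: the energy efficiency is noticeably lower
```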


A battery will vary its voltage across its charging range.
On charge:
internal resistance x charge current squared = internal resistive losses,
which is totally wasted energy.

On discharge:
internal resistance x discharge current squared = internal resistive losses,
which is totally wasted energy.

In one case the wasted energy is reflected by a RISE of the terminal voltage, and in the other by a drop.
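A minimal sketch of those two loss equations, using an assumed internal resistance and an assumed open-circuit voltage (neither value comes from the text):

```python
# I^2 x R losses and their effect on terminal voltage (assumed example values).
r_internal = 0.05   # ohms, assumed
emf = 12.8          # open-circuit ("chemical") voltage, assumed

def charge(i_a: float) -> tuple[float, float]:
    loss_w = i_a**2 * r_internal       # wasted as heat
    v_term = emf + i_a * r_internal    # terminal voltage RISES on charge
    return loss_w, v_term

def discharge(i_a: float) -> tuple[float, float]:
    loss_w = i_a**2 * r_internal       # wasted as heat
    v_term = emf - i_a * r_internal    # terminal voltage DROPS on discharge
    return loss_w, v_term

print(charge(10.0))      # ~5.0 W lost, ~13.3 V at the terminals
print(discharge(10.0))   # ~5.0 W lost, ~12.3 V at the terminals
```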

When charging, in the earlier part of the cycle the internal resistance is relatively low. The Ah (amp-hours) put into the battery are largely recoverable, AND the watt-hours also.
But as charging progresses, internal resistance rises and charging energy efficiency drops, BUT charging current efficiency remains reasonably high.


Taking a LiFePO4 (also known as “LFP”) battery as a superb example: when new, the CURRENT charge-to-discharge efficiency is about 99.5%. As the battery ages this efficiency INCREASES! i.e. almost all the amp-hours put in can be taken out. BUT the watt-hours put in and watt-hours taken out depend on where in the cycle they are put in and how fast they are taken out. Watt-hours in the early part of the cycle are reasonably efficient, but efficiency decreases as voltage rises.


Solar

A photovoltaic (PV / solar) panel for charging a 12 V system typically has 36 cells, an unloaded voltage of > 20 V, and a maximum power point ("MPP") voltage of perhaps 15 V, so that the optimum fully loaded voltage is well in excess of 12 V. Attach this panel to a 12 V battery and the voltage will fall to a value which depends on battery parameters and state of charge.

When loaded beyond its maximum power point the PV panel will approximate a constant current source.

If a PV panel operates at say 3 A, then regardless of the wattage that the panel produces (V x I), whether say
18 V x 3 A = 54 W or
15 V x 3 A = 45 W or
13 V x 3 A = 39 W,
what the battery sees is the 3 A.
The 3 A is what drives the chemical storage reaction. Regardless of terminal voltage, when the battery is discharged you will not get more than 3 Ah out for any 3 Ah put in, and in practice you will get less, because neither charging nor discharging is ever 100% efficient.
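A sketch of that point: modelling the loaded panel as a 3 A constant-current source, the charge delivered is the same whatever voltage the battery pulls the panel down to (the candidate operating voltages below are the ones from the text):

```python
# A loaded PV panel approximated as a constant-current source (3 A, per the text).
i_panel = 3.0
hours = 1.0

for v_operating in (18.0, 15.0, 13.0):   # possible panel operating voltages
    watts = v_operating * i_panel        # power varies with voltage...
    ah = i_panel * hours                 # ...but the charge delivered does not
    print(f"{v_operating:>5.1f} V: {watts:5.1f} W, {ah:.1f} Ah into the battery")
```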

If the battery voltage is at say 12.1 V when you draw 3 A for one hour, and it was charged with a panel that would have charged at 15 V x 3 A "if allowed",
then the returned versus available energy efficiency is
(12.1 V x 3 A) / (15 V x 3 A) x Kah
=~ 81% x Kah,
where Kah is the amp-hour efficiency.
If Kah is 0.9 (90%), then the overall watt-hour efficiency relative to what the panel COULD have made is 0.81 x 0.9 =~ 73%.
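Putting those numbers together (the 12.1 V discharge and 15 V MPP come from the text; the 0.9 value of Kah is the assumed example figure):

```python
# Overall Wh efficiency relative to what the panel COULD have produced.
v_discharge = 12.1   # battery voltage while the 3 A is drawn back out
v_mpp = 15.0         # panel's maximum power point voltage, per the text
k_ah = 0.9           # assumed amp-hour efficiency

voltage_ratio = v_discharge / v_mpp   # ~0.81 (the 3 A cancels out)
overall = voltage_ratio * k_ah        # ~0.73
print(voltage_ratio, overall)
```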

It can be argued that it is "not fair" to say a panel "could have made 15 V x 3 A" when it is loaded down to say 12.5 V by a battery, and that is a valid point. BUT if a 15 V battery had been used, or if an MPPT controller (which allows the panel to work at its optimum point) had been used, then the panel would have produced the 15 V x 3 A claimed. Which approach is "right" depends on what you are trying to determine.