Prolonging lithium-ion battery lifespan

The question is a bit messy.

First, "optimal" might mean different things. Charging a Li-Ion cell to 3.92 V will likely yield 70-80% of its capacity. Is that optimal? It depends on the device's use and discharge pattern. If the device is meant to sit in "hot reserve", like an emergency flashlight, 80% is optimal. If the device is meant to start working at full power immediately (like LED lights or a quadcopter), then you have wasted 20-30% of the capacity for no reason.
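The capacity trade-off above can be captured in a small lookup, purely for illustration. The numbers are the approximate figures quoted in this answer, not measured data, and the function name is hypothetical:

```python
# Approximate usable capacity vs. charge termination voltage, using
# the rough figures quoted in this answer (illustrative only).
USABLE_CAPACITY = {
    4.20: 1.00,   # full rated capacity
    3.92: 0.75,   # roughly 70-80%
}

def runtime_hours(capacity_mah, load_ma, charge_v):
    """Estimate runtime for a given load and charge voltage."""
    return capacity_mah * USABLE_CAPACITY[charge_v] / load_ma
```

For example, a 3000 mAh cell under a 300 mA load runs about 10 h when charged to 4.2 V, but only about 7.5 h when charged to 3.92 V, which is the "wasted capacity" the paragraph above refers to.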

Second, "consumer chargers" are not optimized for the longest battery life when charging to 4.2 V. The magic "4.2 V" comes from mainstream battery specifications. This level is set by the battery manufacturer, based on the standard expectation of 500-1000 charge-discharge cycles before rated capacity drops below the 70% level. Thanks to improvements in technology and materials, there are Li-Ion cells that can be charged to 4.35 V while maintaining the same or better cycle life.

Third, if you charge a cell to 4.2 V, it will take a long time to self-discharge down to 3.92 V, so charging to 4.2 V and relying on self-discharge to reach that level makes no sense.

In general, there is a complex trade-off between charger float voltage, termination current, charge rate, discharge rate, shelf time in the fully charged state, shelf time in the discharged state, and the battery's SOH (State of Health). Sitting in a fully charged state reduces battery cycle life, and sitting in a fully discharged state has a similar effect. So there is no universal recipe for managing the lifespan of a Li-Ion battery.

EDIT: Based on studies of cells from a manufacturer 7-12 years ago (manganese-based cells for the Nissan Leaf), it was found that a charge level of about 3.92 V provides the best balance between the two major deterioration mechanisms: build-up of the SEI (Solid Electrolyte Interphase), which accelerates at high charge levels, and EO (Electrolyte Oxidation). That's where the magic 3.92 V number comes from. However, Li-Ion cells are under continuous improvement, with different anode-cathode-separator materials and slight modifications of electrolyte chemistry, and the parametric space for operating cells is vast, so this magic number might not be universal. Moreover, using 3.9 V yields only about 60-65% of usable capacity.

For practical purposes, if someone wishes to employ a 3.92 V charging scheme, simply disconnecting at 3.92 V would cause an additional loss of capacity, since the CV (constant voltage) part of the charge would be missing. To get a proper charge to 3.92 V, the charger must be reprogrammed with a new set of parameters. Most modern charger ICs can be controlled over an I2C interface, or with external hardware biases.
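A minimal sketch of what "a proper charge to 3.92 V" means: the charger holds constant current (CC) until the cell reaches the target voltage, then holds constant voltage (CV) until the current tapers below a termination threshold. The function name and thresholds here are illustrative assumptions, not taken from any specific charger IC:

```python
# Hypothetical CC/CV control decision, terminating at 3.92 V instead
# of the usual 4.2 V. Thresholds are illustrative assumptions.

def charge_step(cell_voltage, charge_current, *,
                target_v=3.92, term_current=0.05):
    """Decide the next charger action for one control-loop iteration.

    Returns one of: "constant_current", "constant_voltage", "done".
    """
    if cell_voltage < target_v:
        # CC phase: push the full programmed current until the cell
        # reaches the target voltage.
        return "constant_current"
    if charge_current > term_current:
        # CV phase: hold target_v and let the current taper off.
        # Skipping this phase is what loses extra capacity.
        return "constant_voltage"
    # Current has tapered below the termination threshold: stop.
    return "done"
```

Disconnecting the moment the cell first hits 3.92 V corresponds to skipping the "constant_voltage" branch entirely, which is the extra capacity loss described above.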


I understand your question. I did something similar in one of my projects some years ago. I wanted to prioritize cell life over runtime, so I just charged the cell to 4.1 V and then disconnected the charger. The constant-voltage charging phase is what gives you the maximum runtime, so in this case my opinion is that you can go without it.

I actually don't know where the 3.92 V figure comes from, but it is not uncommon to use Li-Ion cells in a narrower voltage range in order to improve battery life. Some EV manufacturers, for example, start with a narrower voltage range that still guarantees the declared vehicle range. Then, as the battery ages and loses capacity, the voltage range is progressively extended to compensate and maintain the original range.
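The EV strategy above can be sketched as a voltage window that widens as state of health (SOH) declines. The linear model and all numbers here are assumptions for illustration only, not any manufacturer's actual policy:

```python
# Hypothetical illustration: start with a conservative voltage window,
# then widen it linearly as SOH drops, to keep usable capacity roughly
# constant. All numbers and the linear model are assumptions.

def charge_window(soh):
    """Return (v_min, v_max) for a given SOH in [0.8, 1.0]."""
    # At SOH 1.0 use a conservative 3.3-4.0 V window; as SOH drops
    # toward 0.8, widen toward the full 3.0-4.2 V range.
    frac = min(max((1.0 - soh) / 0.2, 0.0), 1.0)
    v_min = 3.3 - 0.3 * frac
    v_max = 4.0 + 0.2 * frac
    return (round(v_min, 2), round(v_max, 2))
```

A fresh pack (SOH 1.0) would use the narrow window, while a pack degraded to SOH 0.8 would be allowed the full range, masking the capacity fade from the driver.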