I am not sure how any of these are arguments in favor of Ah over Wh. I, and most power system designers, actually want to know the energy efficiency of a storage system, not its 'current efficiency'. For that you need to know how much energy (Wh) you put into the battery while charging, and then how much energy (Wh) you can get back out while discharging. If you are actually trying to make an accurate energy budget (which we often need to do), you have to take into account that you typically charge a battery at ~14.0 V and then discharge it at ~13.2 V: even if you get 99.5% of the Ah back out, you are only getting about 94% of the energy back. For a carefully designed system, this matters.
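To put numbers on that, here is a minimal Python sketch. The 14.0 V / 13.2 V and 99.5% figures are the illustrative ones from the paragraph above, and the function name is my own:

```python
# Round-trip efficiency: Ah-based vs Wh-based.
# All numbers are the illustrative ones from the paragraph above.

def wh_efficiency(ah_efficiency, v_charge, v_discharge):
    """Energy (Wh) round-trip efficiency from the Ah efficiency
    and the average charge/discharge voltages."""
    return ah_efficiency * (v_discharge / v_charge)

eta_ah = 0.995             # 99.5% of the amp hours come back out
v_chg, v_dis = 14.0, 13.2  # typical charge / discharge voltages

eta_wh = wh_efficiency(eta_ah, v_chg, v_dis)
print(f"Ah efficiency: {eta_ah:.1%}")  # 99.5%
print(f"Wh efficiency: {eta_wh:.1%}")  # ~93.8%
```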
You are correct that, as the battery voltage rises while charging, the energy efficiency of the system goes down; using Ah you are ignoring this loss, and I don't see that as an advantage. As a follow-on: as batteries age, their internal resistance increases, which means it takes more energy to charge the battery and less energy is available from it. Using Ah only, you are ignoring this too, even though it is a measure of battery performance over time.
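A rough sketch of that aging effect, assuming a simple open-circuit-voltage-plus-series-resistance battery model (the voltage, current, and resistance values are hypothetical):

```python
# How rising internal resistance hurts Wh efficiency while the
# Ah count stays the same. All numbers are illustrative.

def wh_efficiency_with_ir(v_oc, i, r_internal, ah_efficiency=1.0):
    """Round-trip Wh efficiency for a battery with open-circuit
    voltage v_oc, charged and discharged at current i through
    internal resistance r_internal."""
    v_charge = v_oc + i * r_internal     # terminal V is higher while charging
    v_discharge = v_oc - i * r_internal  # and lower while discharging
    return ah_efficiency * (v_discharge / v_charge)

V_OC, I = 13.2, 10.0  # volts, amps (hypothetical 12 V-class pack)

for r in (0.010, 0.030, 0.060):  # ohms: new, aged, old
    print(f"R = {r*1000:.0f} mOhm -> Wh efficiency "
          f"{wh_efficiency_with_ir(V_OC, I, r):.1%}")
# Prints roughly 98.5%, 95.6%, 91.3%: the Ah in/out are identical
# in all three cases, but the energy you get back is not.
```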
Ah are a simplification, a legacy from when everyone used the same chemistry/voltage, power was hard to measure, and we primarily used linear regulators that were constant-current devices. None of those is true anymore.
With good reason! Not just "how it's done" or easy calculations.
Amp hours relate to the basic chemical reaction of the battery, whereas watt hours are much more affected by the state of charge during charging and discharging and by the rate of charge and discharge.
Taking an LFP battery as an example: when new, the CURRENT (Ah) charge-to-discharge efficiency is about 99.5%. As the battery ages this efficiency INCREASES! i.e. almost all the amp hours put in can be taken back out. BUT the watt hours put in and the watt hours taken out depend on where in the cycle they occur and how fast they are put in and taken out. Watt hours in the early part of the cycle are reasonably efficient, but the efficiency decreases as the voltage rises.
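A rough illustration of that last point, using made-up but plausible per-cell LFP voltages for the same state of charge reached while charging vs. while discharging:

```python
# Per-segment energy efficiency along an LFP charge/discharge curve.
# Voltages are rough, illustrative per-cell numbers.

SEGMENTS = [  # (SoC band, V while charging, V while discharging)
    ("10-40 %", 3.35, 3.30),
    ("40-70 %", 3.40, 3.28),
    ("70-95 %", 3.55, 3.25),
]

for band, v_chg, v_dis in SEGMENTS:
    # Coulombic (Ah) efficiency is ~99.5% regardless of the band;
    # it is the Wh efficiency of the band that varies with voltage.
    eta = 0.995 * v_dis / v_chg
    print(f"SoC {band}: Wh efficiency ~ {eta:.1%}")
# Prints roughly 98.0%, 96.0%, 91.1%: the same Ah cost more Wh
# near the top of the charge curve.
```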
TL;DR: coulomb counters report amps and amp hours natively because that is what they actually measure, and it is more accurate. It is the watt hours that are derived.
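For what that means in practice, here is a toy model of a coulomb counter (the sample data are invented; a real device reads a shunt at a fixed rate):

```python
# A coulomb counter integrates current samples to get Ah directly;
# Wh needs a voltage sample as well, so it is the derived quantity.
# Sample data below are made up: (current in A, voltage in V),
# one reading per second.

samples = [
    (10.0, 13.8), (10.0, 13.9), (9.5, 14.0), (9.0, 14.1),
]

ah = wh = 0.0
dt_h = 1.0 / 3600.0  # one-second samples, expressed in hours
for current, voltage in samples:
    ah += current * dt_h            # native: shunt current x time
    wh += current * voltage * dt_h  # derived: needs voltage too

print(f"Charge in: {ah*1000:.2f} mAh, energy in: {wh*1000:.2f} mWh")
```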
But Wh are good for comparing energy usage over time between, say, mains AC load devices and 12 V DC loads, or the two sides of an inverter.