That's why you have to taper the voltage down so that you don't continue to force current and overcharge.
My '76 Ford campervan holds a steady 14.5v at any RPM above idle. At idle the voltage drops, normally to around 13.6v, but on the road, 14.5v steady. It doesn't taper voltage, it never has.
The amps do taper off, but that's a natural result of the battery voltage/resistance increasing until it reaches parity with the source.
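That natural taper can be sketched with a toy constant-voltage model. The circuit resistance, capacity, and starting voltage below are illustrative assumptions, not measurements, and the "voltage rises in proportion to charge absorbed" rule is a deliberate simplification:

```python
def taper_currents(v_start=12.2, v_reg=14.5, r=0.05, capacity_ah=80.0, hours=8):
    """Hour-by-hour charge current from a constant-voltage source.

    Toy model: the regulator holds v_reg, and the battery's terminal
    voltage is crudely assumed to rise in proportion to charge absorbed.
    All parameter values are illustrative assumptions.
    """
    v_batt = v_start
    currents = []
    for _ in range(hours):
        amps = max(0.0, (v_reg - v_batt) / r)  # Ohm's law across the gap
        currents.append(amps)
        v_batt = min(v_reg, v_batt + amps / capacity_ah)
    return currents

for hour, amps in enumerate(taper_currents()):
    print(f"hour {hour}: {amps:.1f} A")
```

The source never reduces its voltage; the current falls on its own as the gap between 14.5v and the battery's terminal voltage closes.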
Modern computer controlled voltage regulation sometimes plays games with the voltage, such as multi-stage charging, temperature compensation voltage adjustment, or undercharging to enable regenerative braking, but not even all modern systems do those things - some simply behave the same way a non-computer controlled system behaves.
I agree that with parallel batteries one of them should charge, but you can't say that once one is charged the other /must/ charge.
I didn't say that. I said that one will charge a little, one a lot. But it's parallel, not serial.
On my truck, after starting the 460 big block, the cranking battery surface charge will hit 14.5v in less than a minute once I put it in gear and drive away. The cranking battery has then reached voltage parity with the source and very little current flows to it. It trickle charges until it can't absorb any more at 14.5v.
The deep cycle takes longer for the surface charge to reach 14.5v, during which time it is bulk charging at the limit of the circuit's resistance. Even though the wiring between the batteries is large, the wire from the alternator to the cranking battery is the factory wire, which I believe is #10 (from the factory, with a factory 100a alternator), so I pretty much never see more than 30a into the house battery, and that tapers off as the battery state of charge rises. Once the house battery reaches 14.5v, I might see 5a or 10a for a few hours, depending on temperature, until the house battery has absorbed all it can at 14.5v and the amps taper off to almost nothing.
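Those observed numbers are roughly consistent with plain Ohm's law. The battery voltage and total circuit resistance below are assumed values chosen to illustrate, not measurements:

```python
# Back-of-the-envelope check on why ~30 A would be the ceiling.
# The alternator holds ~14.5 V; a partly discharged battery sits near 13.0 V
# (assumed), so only ~1.5 V is available to push current through the total
# circuit resistance (wire, connections, battery internal R - all assumed).

V_SOURCE = 14.5
V_BATTERY = 13.0   # assumed surface voltage while bulk charging
R_TOTAL = 0.050    # ohms: assumed wire + connections + internal resistance

amps = (V_SOURCE - V_BATTERY) / R_TOTAL
print(f"available current: {amps:.0f} A")  # ~30 A with these assumptions
```

As the house battery's surface voltage climbs, the 1.5v gap shrinks and the current tapers, exactly as described above.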
An alternator is neither a constant current nor a constant voltage source.
In terms of battery charging, it is a constant voltage source with current limiting.
Oh true, it's not very good at constant voltage, except for some with fancy PWM regulators, and/or high power alternators that put out more power at idle, but that is essentially what it is. It regulates voltage, but as far as amps go, it just does current limiting at some max value.
Current will only flow if the load impedance is lower than its equivalent internal impedance, and the system voltage must regulate; otherwise the main battery would overcharge.
True. But for over 40 years my truck has regulated the system voltage at 14.5v and I generally get around 5-6 years out of a starting battery.
The deep cycle battery never lasts that long, but that's because I deliberately abuse it by ignoring the 50% rule and replace it every couple of years.
My old '67 Bug with a generator instead of an alternator also regulated at 14.5v. I've heard some Toyotas are set to 13.9v or 13.7v or something like that.
The remote battery is going to end up doing whatever it will, a fast taper down or a trickle charge
Yes, exactly.
depending on what the system sees.
Depending on battery/circuit resistance, not on what the system sees.
The system merely tries to hold a constant voltage, and supplies whatever amps are needed to do that.
Fancier systems can jump around to different voltage set points, or fudge the voltage a bit based on temperature, but they are still just trying to hold a set voltage by supplying enough amps to do it.
The reasons why battery banks are configured with similar batteries and wired as they are remain valid here even though it's just during charging.
Sorry, but no.
Batteries wired into a full-time bank have to be matched and balanced because they charge and discharge together. If one does more work than the others, it will wear out quicker, which will eventually cause some other battery in the bank to end up having to do more work and end up wearing out quicker. Slow motion chain reaction (usually takes years) leading to the premature failure of the entire bank.
But in a part-time (charging only) "RV style" setup (which most of us use), the batteries don't discharge together. They do different jobs and one ends up working a lot harder - by design - but because they are isolated during discharge it doesn't lead to premature failure of the other.
So the reasons for matching/balancing don't apply.
So you need a low enough impedance that the alternator can force current onto the remote battery without the existing battery or normal loads seeing it.
The cranking battery doesn't "see" anything (personally, I try not to use that word, because it gives people an inaccurate picture of what is happening).
It's all about different resistance values. The alternator obviously has the lowest resistance, or it wouldn't work. The cranking battery, being nearly full, has a higher resistance than the partially discharged house battery.
The voltage regulator is the only thing that sees anything. It sees a voltage below where it should be, so it energizes the field coil of the alternator (engages the clutch, basically), and current is supplied. How much current? However much is required to supply the total load demand.
The current divides among the parallel paths in inverse proportion to their resistance, so a little will go into the nearly full cranking battery (high effective resistance), considerably more will go into the partially discharged house battery (lower effective resistance), and some into the chassis bus to supply pure consumer loads.
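That split is just a current divider. The effective resistance of each branch and the total alternator output below are purely illustrative assumptions:

```python
# Current divides among parallel paths in inverse proportion to their
# resistance - the "least resistance" path simply takes the biggest share.
# All values here are illustrative assumptions, not measurements.

paths = {
    "cranking battery (nearly full)": 1.0,  # ohms, effective
    "house battery (discharged)": 0.1,
    "chassis loads": 0.5,
}

total_conductance = sum(1 / r for r in paths.values())
supply_amps = 40.0  # assumed total alternator output at this moment

for name, r in paths.items():
    share = (1 / r) / total_conductance  # fraction of total current
    print(f"{name}: {share * supply_amps:.1f} A")
```

The regulator doesn't decide any of this; it just supplies whatever total current holds the set voltage, and the resistances do the rest.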
For example, do people with batteries in trailers charging through the plug always achieve full charge? I believe that undercharging is often a problem for them.
It is a common problem, mostly due to the fact that common off the shelf RV converters are usually crap.
They are normally designed to supply 12v loads, not to charge batteries. Many of the lowbuck units are regulated at 12.6v.
Even the good ones, such as Progressive Dynamics, aren't very good battery chargers. The PD bulks to 14.4v, then drops to float at like (IIRC) 13.6v, and then after some time (28 hours? 36? 72? Can't recall offhand.), drops to a lower float voltage of 13.2v.
The problem is the design philosophy. Plugged into shore power most or all of the time, supply 12v loads, maybe keep the batteries topped off - but for God's sake, don't overcharge and explode the batteries.
They do that fine. Battery charging...not so good.
Now that may be an extreme case with poor connectors and small wires, but 20 feet of 16 AWG is still only about 80mΩ, so there's not unlimited margin.
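The 80mΩ figure checks out against the standard ~4Ω per 1000 ft for 16 AWG copper. A quick sketch (the 10a load current is an assumed example, and a round-trip run doubles the length):

```python
# Sanity check on the 16 AWG figure, using the standard ~4.0 ohms per
# 1000 ft resistance of 16 AWG copper wire.

OHMS_PER_FT_16AWG = 0.004
length_ft = 20
r_one_way = OHMS_PER_FT_16AWG * length_ft
print(f"{length_ft} ft of 16 AWG: {r_one_way * 1000:.0f} mOhm")  # -> 80 mOhm

# Voltage drop at an assumed 10 A charge current over the full 40 ft loop:
drop = 10 * (2 * r_one_way)
print(f"drop at 10 A (round trip): {drop:.2f} V")  # -> 1.60 V
```

A 1.6v drop eats most of the headroom between 14.5v and a partly charged battery, which is why trailer-plug charging is so slow even when it does work.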
Oh true, there are limits where too ridiculously small of a wire can prevent charging altogether. But remember the POTS phone system delivers nominally 48v from the central office - enough to power a phone - over some pretty small wire and long distances.
More recently, Power over Ethernet (PoE).
Whether the difference between 2 AWG and 1/0 is enough is doubtful, but there's no guarantee that the remote battery must charge either.
No guarantee, true. But it always works.