
Didn't I title my last post, "violent agreement"?
I'll buy you the beer, but you gotta buy me a decent Merlot.
Deal, but please do give my last post a sanity check. I know what I meant to write, but we non-EE types don't always get the details right.
So what's the verdict? Can you fully charge a 120Ah house battery from the alternator or not?
Absolutely yes. But be prepared for it to take a few days' drive time to get it done.
How's the math on that work? If the alternator outputs 120 amps, how much is the engine drawing just to run? Where is it all going? Math me, bro.
Only have a few minutes.
Electricity doesn't exist until you use it. It's only a potential. Hoover Dam has a 2000 megawatt potential. So the water is flowing full speed and the turbines are spinning and you hook up a 40w bulb to it.
How much electricity does it make?
40 watts.
Same with any power supply - battery charger, alternator, whatever. Your alternator can *potentially* produce 120a, but it won't unless there is 120a of load sucking the power out of the alternator. If everything in your truck is only pulling 6a, then your "120a" alternator is producing 6a.
The extra 114a doesn't "go somewhere". It doesn't exist. It's only a potential.
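That point can be sketched as a toy model: a "120a" rating is a ceiling, not a constant output. The numbers below are illustrative, not measured from any real alternator.

```python
# Toy model: an alternator's rating is a ceiling, not a constant output.
# Actual current equals whatever the connected loads draw, capped at
# the rating. The 120 A figure is the example rating from the thread.

ALTERNATOR_RATING_A = 120.0  # nameplate maximum

def alternator_output(load_amps: float) -> float:
    """Current actually produced: the load's demand, up to the rating."""
    return min(load_amps, ALTERNATOR_RATING_A)

print(alternator_output(6.0))    # truck electronics only -> 6.0 A
print(alternator_output(200.0))  # demand beyond the rating -> capped at 120.0 A
```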
Then I guess I don't understand why the house battery can't suck that power out of the alternator. What is the limiting component? Wires, charge controller, battery?
Follow-up question: while it may take 12-36 hours of drive time to get to 100%, how variable is the charge rate? Will it, for example, get to 80% in 1 hour and then take 11 hours for the last 20%? What is the equation that governs the charge rate in the battery?
Didn't I title my last post, "violent agreement"?
-- Electrical circuits are circuits. Agreed. This means that resistance on either leg (positive or negative) can cause a voltage drop in the circuit. It is simply easier to imagine all of it lying between points A and B.
-- The charging circuit must be wired to carry the full output of the charger over the distance between the charger and the battery. Agreed. (In my particular case, I simply accepted the GM factory wiring and took AM Solar's suggested wiring size for the solar kit.)
The crux of the issue may be "oversized" vs. "right sized." Here too, I agree.
In my case, I have a 300Ah starter battery bank connected to a pair of alternators with a combined specced output of 250A. I am using the GM factory wiring.
This 300Ah battery bank is connected to a 600Ah camper battery bank, about 20 linear feet away. (The actual cable length is probably closer to 25 feet one way; call it 40-50 feet round trip.) I want to achieve the highest possible current flow between the two battery banks. Using Chris Gibson's formula as a guide (http://www.smartgauge.co.uk/cable_type.html), if I go for a voltage drop of 0.5v in the circuit, then I need 360mm² of copper. I simply rounded this down to about 100mm², which I get by using a pair of AWG 1/0 cables. Using a pair makes it easier to run, especially as each of my starter batteries is on a different side of the truck.
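For anyone who wants to check figures like these for their own rig, the raw physics is just the voltage-drop relation: V = I x R, with R = rho x L / A, rearranged for the cross-section. This is only a back-of-the-envelope sketch; sizing guides like the SmartGauge page linked above fold in additional derating factors, so their answers will differ from the bare formula. The 15 m / 250 A / 0.5 V inputs below are example values, not a reproduction of the 360mm² figure.

```python
# Back-of-the-envelope cable sizing from the basic voltage-drop relation:
#   V_drop = I * R,  R = rho * L / A   =>   A = rho * L * I / V_drop
# L is the ROUND-TRIP length (out and back). Sizing guides add derating
# and safety factors on top of this raw physics.

RHO_COPPER = 1.72e-8  # resistivity of copper, ohm-metres, at roughly 20 C

def min_cross_section_mm2(length_m: float, current_a: float, v_drop: float) -> float:
    """Minimum copper cross-section (mm^2) for a given round-trip
    length, current, and allowed voltage drop."""
    area_m2 = RHO_COPPER * length_m * current_a / v_drop
    return area_m2 * 1e6  # m^2 -> mm^2

# Example: ~50 ft (15 m) round trip, 250 A, 0.5 V allowed drop.
print(round(min_cross_section_mm2(15.0, 250.0, 0.5), 1))  # -> 129.0 mm^2
```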
So, my wiring is not "oversized" but rather only about 1/4 of the textbook size.
But when you realize that 99% of all dual-battery setups run cables between AWG 6 and 10 (that is, under 15mm²), it is "oversized" compared with industry practice, yet in fact undersized for the potential amp flow over the distance required.
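Those gauge-to-area figures are easy to sanity-check; the constant and exponent below are the standard AWG definition, nothing specific to this setup.

```python
# Standard AWG-to-area conversion: area in mm^2 is
#   0.012668 * 92 ** ((36 - n) / 19.5)
# where n is the gauge number (0 for AWG 1/0).

def awg_to_mm2(gauge: int) -> float:
    """Cross-sectional area of an AWG gauge in mm^2 (0 means 1/0)."""
    return 0.012668 * 92 ** ((36 - gauge) / 19.5)

print(round(awg_to_mm2(6), 1))   # AWG 6   -> ~13.3 mm^2
print(round(awg_to_mm2(10), 1))  # AWG 10  -> ~5.3 mm^2
print(round(awg_to_mm2(0), 1))   # AWG 1/0 -> ~53.5 mm^2 (a pair: ~107 mm^2)
```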
Some more agreement. A charger will only "see" the battery to which it is most closely connected.
In the case of lead-acid batteries, the function is really rather simple. The charger monitors the voltage of the battery and, when it detects a drop, supplies current until the voltage is raised back to the desired level. The more modern the charger, the more the bells and whistles and the better the charging function. Modern chargers:
-- Use higher voltages than before, since the greater the voltage difference, the greater the current flow.
-- Use a multi-stage program, typically bulk, absorb, float, to charge fast, charge deep (dissipate surface charge), and maintain.
-- Typically incorporate temperature sensing to raise voltage as the weather turns cold while, at the same time, avoiding boiling off moisture in the battery. (Especially critical in gel and AGM batteries, which cannot be topped up with water.)
They may also respond to specific needs like headlights or windshield wipers. Some also incorporate a shunt to more quickly respond to battery discharge and, in some cases, a remote voltage sensor to compensate for voltage loss in the circuit.
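For what it's worth, the bulk/absorb/float logic described above can be sketched in a few lines. The setpoints below are generic textbook values for a 12v lead-acid profile, assumptions for illustration, not any particular charger's settings.

```python
# Minimal sketch of a three-stage lead-acid charging decision.
# Setpoints are generic 12 V lead-acid textbook values (assumed).

BULK_LIMIT_A = 40.0   # charger's maximum current, held during bulk
ABSORB_V = 14.4       # target voltage for bulk/absorb stages
FLOAT_V = 13.6        # maintenance voltage after the battery is full
TAPER_DONE_A = 2.0    # when absorb current falls this low, drop to float

def charger_stage(battery_v: float, current_a: float, in_float: bool) -> str:
    """Pick the charging stage from battery voltage and present current."""
    if in_float:
        return "float"          # already maintaining; stay there
    if battery_v < ABSORB_V:
        return "bulk"           # hold max current while voltage rises
    if current_a > TAPER_DONE_A:
        return "absorb"         # hold ABSORB_V while current tapers
    return "float"              # battery full: hold FLOAT_V

print(charger_stage(12.6, 40.0, False))  # deeply discharged -> bulk
print(charger_stage(14.4, 12.0, False))  # at target voltage -> absorb
print(charger_stage(14.4, 1.5, False))   # current tapered off -> float
```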
So, how does the truck alternator/regulator side of my system work?
-- Start the engine, or let the sun rise; battery voltage rises above 13.2v. The relay closes.
-- At this point, the starter batteries and camper batteries are connected. Current flows from the more highly charged battery to the less charged battery.
-- The rate of that flow is determined by the voltage difference and restricted by the resistance of the cabling.
-- Either or both chargers will respond to any voltage drop in their battery(s) by ramping up voltage and current, to the limits of the charger.
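To put an equation on that last point (and on the earlier question about what governs the charge rate): the flow between the two banks is just Ohm's law, I = (V_high - V_low) / R. As the lower bank charges, the voltage difference shrinks and the current tapers, which is exactly why the last stretch of a charge takes so long. The 4-milliohm cable resistance below is an assumed round number, not a measurement.

```python
# Ohm's law applied to two connected battery banks: current is the
# voltage difference over the cable/connection resistance. As the
# low bank charges, the difference shrinks and the current tapers.

def inter_bank_current(v_starter: float, v_camper: float, r_cable_ohm: float) -> float:
    """Current (A) flowing from the starter bank to the camper bank."""
    return (v_starter - v_camper) / r_cable_ohm

R = 0.004  # ~4 milliohms of cable and connections (assumed value)
print(round(inter_bank_current(13.8, 13.0, R), 1))  # 0.8 V difference -> 200.0 A
print(round(inter_bank_current(13.8, 13.6, R), 1))  # 0.2 V difference -> 50.0 A
```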
So, to go back to the original poster's system, what is the news you can use?
-- You need to know the size of the primary battery and the capacity of your charger.
-- Then you need to know the distance between the batteries. Both under the hood? You can probably use 15 feet, round trip. Back of your truck? You are probably guessing over 40 feet.
This known, you can calculate the size of the cabling required. This chart is probably close enough:
[Attachment 238559: cable sizing chart]
Where would I differ with the original poster? Only on this, if you have a solar kit, there are benefits to using an automatic, bidirectional relay, as opposed to a key controlled relay.
I'll buy you the beer, but you gotta buy me a decent Merlot.
Once again, I find myself agreeing with dwh. (Perhaps because he does this for a living?)