Help with my multi battery setup

st34lth

New member
Hello all,

I've been considering adding battery capacity to my vehicle for some time now, and the funds are finally available to do so. I use the vehicle heavily as a mobile office, complete with a printer, more than one computer, and laptops and screens for my family if they so choose. Since I don't want to leave the diesel running constantly but still want to power our campsite for a good stretch, I have decided on a multi-battery setup and would like your thoughts.

The batteries are to be stored in my cap, so sealed is a must as I don't want to add venting. I'm looking at four of these (http://www.sears.com/diehard-marine-deep-cycle-rv-battery-group-size/p-02827582000P) DieHard Marine deep cycles, 115Ah apiece (12V), for a total of 460Ah across the bank. That should be plenty to cover me for at least two days of intermittent usage, based on my calculations.
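For what it's worth, here's roughly how I ran those numbers - a quick Python sketch with my own guesstimated wattages and hours (yours will differ, so treat it as a template, not gospel):

# Rough daily budget - wattages and hours are my estimates, not measurements
loads_w = {"two laptops": 100, "monitors": 50, "printer": 10, "phones/lights": 25}
hours_per_day = 6            # intermittent office use
inverter_eff = 0.85          # rough efficiency for a 2kW inverter

wh_per_day = sum(loads_w.values()) * hours_per_day / inverter_eff
ah_per_day = wh_per_day / 12.0          # amp-hours at 12V nominal

bank_ah = 4 * 115                       # four 115Ah batteries in parallel
usable_ah = bank_ah * 0.5               # stay above 50% depth of discharge

print(f"{ah_per_day:.0f}Ah/day used, {usable_ah:.0f}Ah usable -> {usable_ah / ah_per_day:.1f} days")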

The batteries will be in the back so I plan to run some 00 cable to the rear and a BlueSea ACR to mediate the charging of the battery bank. I have a 2kw inverter that will be attached to the bank.

1) Are there other, more efficient, battery options I should consider?

2) Suggestions for an alternative charging relay/isolator?

Any help is greatly appreciated!
 

High_Country

Adventurer
I'm not the most intelligent on the subject, but since you've got no replies yet, here's an opinion for what it's worth.

I'm a fan of heavy-gauge cable and all that to minimize voltage drop, but I think 00 cable might be a bit of overkill. If I understand correctly, your main power supply (the supplemental batteries) will be in the bed of the truck. You'll be running the heavy cable from the vehicle batteries to the supply batteries just to charge them while the vehicle is running (correct?). If that's the case, I'd think 2-gauge cable would be more than enough. You shouldn't be pushing hundreds of amps down this cable, so you shouldn't need that big of a cable.

I mean, I understand 'bigger is better', but given the expense, stiffness, and routing hassle of 00 cable, I would go smaller.
 

86tuning

Adventurer
The wire from the alternator would be the limiting factor on charge rate. Most of them are smaller than 10 gauge, so going to 00 would be excessive.

Even with a heavy-draw car stereo setup, you won't usually see anything bigger than 4-gauge wire between battery banks.
 

dwh

Tail-End Charlie
The batteries are to be stored in my cap, so sealed is a must as I don't want to add venting. I'm looking at four of these (http://www.sears.com/diehard-marine-deep-cycle-rv-battery-group-size/p-02827582000P) DieHard Marine deep cycles, 115Ah apiece (12V), for a total of 460Ah across the bank. That should be plenty to cover me for at least two days of intermittent usage, based on my calculations.

230ah available if you want to keep the bank above 50% DoD (depth of discharge) for long battery life.


The batteries will be in the back so I plan to run some 00 cable to the rear and a BlueSea ACR to mediate the charging of the battery bank.

You don't really need that size cable for *charging*. There is likely a #10 or a #8 from the alternator to the primary battery anyway, so anything bigger than that won't really help.

The ACR will tie the primary and secondary. The secondary will suck amps from the "12v bus" (alternator + primary battery) until the secondary bank reaches a "surface charge" equal to the 12v bus voltage. After that, it'll mostly be a trickle charge effect.

I know...you're thinking of voltage drop. Voltage drop is very important when sizing cable for *loads*, but not very important when sizing cable for *charging*.

(At least, not so important when sizing cable for charging from a voltage-regulated alternator, which is a "constant voltage" charging system. Cable size is a bit more important when charging from a "constant current" charger, but even then, it's only really important in terms of the max amps the charger can supply, and not so important in terms of voltage drop.)

The batteries are what control the voltage of the bus (until they reach the voltage regulator's bus voltage set point, then the regulator controls the bus voltage).

When the secondary bank is tied in to the primary bus, the voltage of the whole bus (primary and secondary tied together into a single bus) will be drawn down by the secondary battery. The voltage regulator will keep the alternator switched on until the bus voltage reaches the voltage regulator's set point.

Then, the voltage regulator will just keep switching the alternator on and off to hold the bus voltage. But by then, the secondary battery will have a surface charge equal to the bus voltage, and there won't be many amps flowing through that battery (because there won't be much voltage differential, or "electromotive force"), and there won't be enough voltage drop to matter.

The secondary battery's voltage will be constantly trying to drop (and pull down the bus voltage), so the voltage regulator will be constantly switching the alternator on and off to keep bumping the bus voltage back up to where it's supposed to be. But the secondary battery's voltage won't drop much, so again, there won't be any great voltage differential to pump amps through the battery.

Ultimately, as the secondary slowly creeps up toward full charge, the amps flowing to it will be less and less - and any voltage drop will also be less and less - until finally, only an amp or two is flowing, and there isn't any voltage drop anyway.


Voltage drop is important for loads from a battery, as the battery drains and the amp flow increases and so does the voltage drop. It's not very important for charging, where the battery voltage rises, the amp flow decreases and any voltage drop eventually goes away anyway.



I have a 2kw inverter that will be attached to the bank.

Now here you need BigAss(tm) cable!

What is the maximum overload that inverter will handle? Lemme just do a WAG here and say it's rated at 2kw continuous, and can handle up to 3kw surge. Now, factor in the low voltage shutdown set point of the inverter. Let's just say...oh...10.5v.

Okay, so here's the math:

3000w / 10.5v = 286a

For loads, you gotta figure it based on the worst case scenario. So to feed the inverter in my example, you'd need cable (and fuse) rated to handle 300 amps with X voltage drop (say 2%) over Y distance (better not be much...say 3' one direction or a 6' loop).
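If you want to plug your own numbers in, the back-of-the-napkin version looks like this (the 3kw surge and 10.5v cutoff are my guesses - check your inverter's spec sheet - and the 2/0 resistance is the standard copper value):

# Worst-case inverter feed: surge wattage at the low-voltage cutoff
surge_w = 3000.0        # assumed surge rating
cutoff_v = 10.5         # assumed low-voltage shutdown point
max_amps = surge_w / cutoff_v
print(f"worst case draw: {max_amps:.0f}A")   # ~286A -> size cable and fuse for ~300A

# Voltage drop on a short 2/0 run (copper, round-trip length)
loop_ft = 6.0                          # 3 feet out, 3 feet back
ohms_per_1000ft_2_0 = 0.0779           # approximate resistance of 2/0 AWG copper
drop_v = max_amps * ohms_per_1000ft_2_0 * loop_ft / 1000.0
print(f"drop on 2/0 over a {loop_ft:.0f}ft loop: {drop_v:.2f}V ({drop_v / 12.0 * 100:.1f}%)")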

Now, if you plan to run that (huge, horrible, ginormous, ungodly) load WHILE the engine is running, then yea...you'd want BigAss(tm) cable from the primary to the secondary as well as from secondary to inverter.



1) Are there other, more efficient, battery options I should consider?

Nah.

Lead-acid batteries are not precise electronic components. They are all just big sloppy chemistry experiments in a plastic box. In other words: Junk. Properly cared for, you probably won't kill them too quickly. Personally, I buy CheapAss(tm) batteries, beat the crap out of them and replace them every couple of years. I can't be bothered to pussyfoot around babying some goofy chemistry experiment.


2) Suggestions for an alternative charging relay/isolator?

With that battery bank, you would be far, FAR better off with a good quality DC-DC multi-stage charger. One of these bad boys would do the trick right proper:

http://sterling-power-usa.com/batterytobatterychargers.aspx


No, not cheap. But a hell of a lot cheaper than replacing a quartet of quality AGMs after you kill them by never getting them properly charged from a crappy automotive so-called "charging system".


And keep in mind - even a proper 50a DC-DC charger is gonna take AT LEAST 5 HOURS of engine run-time just to do the bulk stage recharge on your proposed battery bank. PLUS however many hours it will need for the absorb stage.

And that's if the bank is only down to 50% DoD.
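Rough math on that, if you want to play with the numbers (this assumes a perfectly steady current into the bank the whole time, which is optimistic - charge inefficiency and taper make it worse):

# Bulk-stage time estimate (optimistic: ignores charge losses and taper)
bank_ah = 4 * 115
deficit_ah = bank_ah * 0.5        # bank drawn down to 50% DoD
for charger_amps in (50, 100):
    print(f"{charger_amps}A charger: ~{deficit_ah / charger_amps:.1f} hours of bulk")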

A stock voltage-regulated alternator setup is gonna take a WHOLE LOT LONGER to get it done.

So you better figure on buying a bloody good BigAss(tm) shore powered charger as well. A Prosine 2.0 inverter/charger that can do 100a bulk stage from shore power would be best (but again, not cheap):

http://www.xantrex.com/power-products/inverter-chargers/prosine-2.aspx


But hell, ANY decent multi-stage shore power charger like an Iota, Samlex or even a Battery MINDer would do the job. And you WILL need it if you don't want to have to replace that battery bank in a year or two.
 

4x4junkie

Explorer
With that battery bank, you would be far, FAR better off with a good quality DC-DC multi-stage charger. One of these bad boys would do the trick right proper:

http://sterling-power-usa.com/batterytobatterychargers.aspx


No, not cheap. But a hell of a lot cheaper than replacing a quartet of quality AGMs after you kill them by never getting them properly charged from a crappy automotive so-called "charging system".


And keep in mind - even a proper 50a DC-DC charger is gonna take AT LEAST 5 HOURS of engine run-time just to do the bulk stage recharge on your proposed battery bank. PLUS however many hours it will need for the absorb stage.

And that's if the bank is only down to 50% DoD.

A stock voltage-regulated alternator setup is gonna take a WHOLE LOT LONGER to get it done.

I'm not sure I can believe the (inflated?) claim on that page to "Recover your batteries up to five times faster than a standard alternator" when no additional source of power is provided besides said alternator (certainly it doesn't make an alternator produce more current, especially with the simple way it's hooked up, and most alternators these days already put out well over 50 amps on their own). 30-50% faster, due to its more precise absorb charging algorithm and the ability to set it for different battery types, would make more sense.
To me that looks like a unit more for the purpose of protecting your batteries from overcharging during exceptionally long periods of engine operation, since alternators don't have a "Float" stage themselves and would otherwise be holding your batteries at 14.4V unnecessarily after they've reached a full charge. The OP said he wanted to reduce his amount of engine operation.


My recommendation to the OP is: run a #2 or #0 wire from your main vehicle battery to your house battery bank with the ACR inline (agreed, #00 is overkill here). This should easily get the 70-100 amps most modern vehicle alternators are capable of into the house bank to get a good bulk charge going while the engine runs (if you have dual alts, use the #0).
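To put rough numbers on the #2 vs #0 question (assuming something like 100 amps of charge current and about 15 feet of cable each way; the ohms-per-1000ft figures are the standard values for copper, and adjust the loop length if you ground through the frame):

# Voltage drop from the main battery to the house bank while charging
awg_ohms_per_1000ft = {"#2": 0.1563, "#0": 0.0983, "#00": 0.0779}  # copper
amps = 100.0
loop_ft = 30.0        # ~15ft out on the positive plus an equal return path

for gauge, ohms in awg_ohms_per_1000ft.items():
    drop = amps * ohms * loop_ft / 1000.0
    print(f"{gauge}: {drop:.2f}V drop at {amps:.0f}A")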
I will then 2nd the inverter/charger recommendation given above (in place of your inverter-only unit). This makes it nothing more than a matter of plugging the vehicle into shore power (or a generator) for a few hours to overnight to have the house batteries 100% charged up & ready to go (or for putting the vehicle into storage). Such units usually have a transfer relay built in so that while it's plugged in, it automatically switches all your circuits directly to the shore power itself while the charger operates. When unplugged from shore power, it automatically switches back to Inverter mode (the switchover times on a good unit such as the Xantrex are fast enough that even sensitive stuff like TVs and computers shouldn't "see" the switch, you can leave them on & running no problem).
 

Coscienceguy

Observer
One thing that maybe should be discussed is that you mentioned you have a diesel. My '99 F350 diesel has two batteries under the hood. I am currently researching my own third-battery install. While I don't need the amps you do, I am looking at the best method for hooking up an Optima Yellow Top to my other two batteries under the hood. I am leaning toward a Sure Power Industries 120A one-input, three-output isolator. If your diesel is like mine, is your accessory battery bank going to be charged adequately by your stock alternator? Stock on mine is 110A, so I am also exploring the idea of a high-output alternator upgrade as well. Any seasoned vets' thoughts on this?
 

G35Vortec454

Adventurer
First off, the link you provided for the batteries does not explicitly state that they are sealed.

Second, as already stated by another poster, the ACR does not isolate your house battery from your starting battery. I personally like the isolation provided by a high-quality battery isolator.

If it takes more than 5 hours to bulk charge your batteries, your charging source is undersized. Upgrade to 100A or 120A charging capacity, using a quality charger or a high output alternator.

I disagree with the other posters on cable size. Your 2kW inverter and 100A or 120A charger would benefit from the 00 or even the maximum 4/0 size. Remember that your system will not be operating optimally half the time, and big cables help compensate for that.

Here's my mobile power system, with a 3KW pure sine inverter powered by 3 battery banks, 200A alternator, 4KW Onan generator, and 450W solar panels.
 

dwh

Tail-End Charlie
I'm not sure I can believe the (inflated?) claim on that page to "Recover your batteries up to five times faster than a standard alternator" when no additional source of power is provided besides said alternator (certainly it doesn't make an alternator produce more current, especially with the simple way it's hooked up, and most alternators these days already put out well over 50 amps on their own). 30-50% faster, due to its more precise absorb charging algorithm and the ability to set it for different battery types, would make more sense.

Well, for one thing, it does a constant current bulk stage. Normal voltage-regulated alternator charging systems are constant voltage, not CC.

For another, it up-converts the voltage. It can bulk as high as 14.8v (programmable), as long as the voltage on the primary side is at least 13v. (If the primary side is less than 13v, it just puts itself in sleep mode.)

It's also temperature compensated.


It certainly *does* make the alternator produce more current.

In normal operation, a voltage-regulated alternator will produce only enough current to supply whatever the loads on the bus are, plus whatever can flow through the battery (which is just another load on the bus anyway). The voltage regulator will switch off the alternator when the bus voltage reaches the set point, and switch it back on when the bus voltage drops.

The unit in the link will put a load on the primary bus to drag down the bus voltage to 13v and keep it down, forcing the voltage regulator to keep the alternator energized, forcing the alternator to put out power (current) at 100% duty cycle.

It doesn't make the alternator produce any higher *maximum* current, but it does force it to produce current 100% of the time, which is "more current". It's also got an alternator temp sensor to make sure it doesn't overheat the alternator.



To me that looks like a unit more for the purpose of protecting your batteries from overcharging during exceptionally long periods of engine operation, since alternators don't have a "Float" stage themselves and would otherwise be holding your batteries at 14.4V unnecessarily after they've reached a full charge.

That's not usually an issue.

Most voltage regulators diddle the field coil to keep the voltage between 13.5v and 14.4v. The target is usually an *average* bus voltage of 13.8v, which is just a somewhat high float voltage for most batteries.

But even if the voltage regulator held the bus voltage to 14.4v all the time (there are some PWM voltage regulators which I think can do this), it doesn't much matter, since most batteries can handle it for some hours.

For instance, an Iota charger with IQ/4 control module has an 8 hour timer on the absorb stage. Most batteries can handle that. In normal use, it would be rare for someone to exceed that by driving (though, sure - it can be done.) This is also the reason absorb time-out is not really an issue with solar - the absorb stage is going to end when the sun goes down no matter what.




Such units usually have a transfer relay built in so that while it's plugged in, it automatically switches all your circuits directly to the shore power itself while the charger operates. When unplugged from shore power, it automatically switches back to Inverter mode (the switchover times on a good unit such as the Xantrex are fast enough that even sensitive stuff like TVs and computers shouldn't "see" the switch, you can leave them on & running no problem).

Yea. One nice feature of the newer Prosine units is that they can be programmed to not exceed the shore power's potential (say, for instance it's plugged into a 15a shore power circuit) by drawing from the battery bank to supplement the shore power when the load exceeds what the shore power can supply.
 

G35Vortec454

Adventurer
Well, for one thing, it does a constant current bulk stage. Normal voltage-regulated alternator charging systems are constant voltage, not CC.

For another, it up-converts the voltage. It can bulk as high as 14.8v (programmable), as long as the voltage on the primary side is at least 13v. (If the primary side is less than 13v, it just puts itself in sleep mode.)

It's also temperature compensated.


It certainly *does* make the alternator produce more current.

In normal operation, a voltage-regulated alternator will produce only enough current to supply whatever the loads on the bus are, plus whatever can flow through the battery (which is just another load on the bus anyway). The voltage regulator will switch off the alternator when the bus voltage reaches the set point, and switch it back on when the bus voltage drops.

The unit in the link will put a load on the primary bus to drag down the bus voltage to 13v and keep it down, forcing the voltage regulator to keep the alternator energized, forcing the alternator to put out power (current) at 100% duty cycle.

It doesn't make the alternator produce any higher *maximum* current, but it does force it to produce current 100% of the time, which is "more current". It's also got an alternator temp sensor to make sure it doesn't overheat the alternator.





That's not usually an issue.

Most voltage regulators diddle the field coil to keep the voltage between 13.5v and 14.4v. The target is usually an *average* bus voltage of 13.8v, which is just a somewhat high float voltage for most batteries.

But even if the voltage regulator held the bus voltage to 14.4v all the time (there are some PWM voltage regulators which I think can do this), it doesn't much matter, since most batteries can handle it for some hours.

For instance, an Iota charger with IQ/4 control module has an 8 hour timer on the absorb stage. Most batteries can handle that. In normal use, it would be rare for someone to exceed that by driving (though, sure - it can be done.) This is also the reason absorb time-out is not really an issue with solar - the absorb stage is going to end when the sun goes down no matter what.






Yea. One nice feature of the newer Prosine units is that they can be programmed to not exceed the shore power's potential (say, for instance it's plugged into a 15a shore power circuit) by drawing from the battery bank to supplement the shore power when the load exceeds what the shore power can supply.

Excellent explanation!

Yeah, load-sharing is a nice-to-have inverter feature.
 

4x4junkie

Explorer
Well, for one thing, it does a constant current bulk stage. Normal voltage-regulated alternator charging systems are constant voltage, not CC.
Sure they are. When said system is loaded down by a discharged battery, it constantly dumps its maximum current into the battery (the alternator section in my Ford's factory service book actually states: "The alternator is self-current limiting"). On most modern alternators, this is FAR more than the 50 amps that DC-DC unit is rated at.

For another, it up-converts the voltage. It can bulk as high as 14.8v (programmable), as long as the voltage on the primary side is at least 13v. (If the primary side is less than 13v, it just puts itself in sleep mode.)

It's also temperature compensated.

I did say it was more precise about it... :)
And alternators are temp-compensated too (at least Ford's 3G ones are, can't swear about all others). But due to engine heat and other environmental factors, no doubt they are not as precise as a dedicated electronic charger would be.

It certainly *does* make the alternator produce more current.

In normal operation, a voltage-regulated alternator will produce only enough current to supply whatever the loads on the bus are, plus whatever can flow through the battery (which is just another load on the bus anyway). The voltage regulator will switch off the alternator when the bus voltage reaches the set point, and switch it back on when the bus voltage drops.

The unit in the link will put a load on the primary bus to drag down the bus voltage to 13v and keep it down, forcing the voltage regulator to keep the alternator energized, forcing the alternator to put out power (current) at 100% duty cycle.

It doesn't make the alternator produce any higher *maximum* current, but it does force it to produce current 100% of the time, which is "more current". It's also got an alternator temp sensor to make sure it doesn't overheat the alternator.

If the bus voltage is to reach its set point (and the alternator switch off or reduce its output), a battery connected to it would have to be at something fairly close to full charge. If the battery is significantly discharged, then that battery will not allow the voltage to reach the set point = alternator is pumping out its maximum current. Once the battery comes up and is no longer dragging the voltage below the regulator's set point, then the alt will start cutting back its power (holding the voltage steady... effectively it is running in Absorption mode).
With that DC-DC unit's 50-amp maximum, I don't see how that can save any time over the alternator pumping its 80-100+ amp maximum into the battery(ies) on its own.


That unit is clearly made for a marine application, where a large vessel quite often sees run times well exceeding the 8 hours you mentioned, so yea, there's a market for them in marine use. But for most land vehicles I think it is over-complicating things (especially in this case since [again] the OP here said he wanted to reduce the amount he's running the engine).





Most voltage regulators diddle the field coil to keep the voltage between 13.5v and 14.4v. The target is usually an *average* bus voltage of 13.8v, which is just a somewhat high float voltage for most batteries.

But even if the voltage regulator held the bus voltage to 14.4v all the time (there are some PWM voltage regulators which I think can do this), it doesn't much matter, since most batteries can handle it for some hours.

For instance, an Iota charger with IQ/4 control module has an 8 hour timer on the absorb stage. Most batteries can handle that. In normal use, it would be rare for someone to exceed that by driving (though, sure - it can be done.) This is also the reason absorb time-out is not really an issue with solar - the absorb stage is going to end when the sun goes down no matter what.

13.5-13.8V isn't nearly enough to fully recover a vehicle's battery in a reasonable amount of run time (a battery probably wouldn't last 3 years in a car being charged that little).
Again, I can't comment as to what the exact specs are on all vehicles, but every one I've owned has had no problem whatsoever putting out enough to properly charge a battery (and the batteries have had at least a 5-year service life, usually far longer).
I will say a mass-remanufactured alternator I bought one time (won't ever do that again) did have some issues with insufficient voltage output, but it was still above 13.8 (more like 14.0 or so). So perhaps these vary a bit (might've had a Chinese voltage regulator in it :rolleyes:). It also fluctuated a lot with varying engine RPM.
 
I use dual Odyssey PC1500 batteries with a National Luna split charge kit. The controller lets you switch the batteries between parallel and series without having to rewire anything. The aux circuit outputs to a disco fuse box and everything runs off that. Both circuits have marine isolator switches for safety, which means you can run the batteries in series safely.

An expensive way to do it; if you're tech savvy you could just buy the components and save a fair bit of cash. The kit comes with a solenoid that routes charge to whichever battery needs charging. The battery monitor is great, as it gives you info on the state of your alternator and batteries.

Dave
 

dwh

Tail-End Charlie
Sure they are. When said system is loaded down by a discharged battery, it constantly dumps its maximum current into the battery

No, it doesn't. It dumps however much can overcome the resistance of the battery and the wiring (and the frame if that's what's being used for the ground), at whatever the bus voltage is.


For example:

http://aimpartsonline.com/faq

"Q: My charging system spec's are 14.5 volts. Why is my system running at 13.0 volts?

A: There is a load being applied to system. When amp's go up volts go down.

Example: A 105amp alternator with a 60amp load will run at 13.0 volts."


So how much can be expected to flow through the battery at that voltage?

Here's the math to figure that:

http://www.smartgauge.co.uk/nosurge2.html
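The gist of it, in made-up round numbers (the internal resistance figure here is just an assumption for illustration; real numbers vary a lot with battery type, temperature and state of charge):

# How many amps can actually flow into a battery at a given bus voltage?
# Roughly: I = (bus voltage - battery's resting voltage) / total circuit resistance
bus_v = 13.0            # bus sagging under load, as in the FAQ example above
batt_v = 12.2           # resting voltage of a partly discharged battery
r_total = 0.02          # assumed battery internal resistance + cabling, ohms

amps = (bus_v - batt_v) / r_total
print(f"~{amps:.0f}A into the battery")
# (and remember the C/4 warranty ceiling: roughly 25a for a typical 105ah battery)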


But, even if the alternator did dump its full rated output to the battery, it would likely void the warranty of just about any deep cycle battery. Most deep cycle batteries have a max charge current rating of C/4 (amp hour Capacity divided by four). So for a typical 105ah deep cycle battery, that would be 25a.

Some batteries can handle more. Odysseys are crazy - they can take C*4. Optimas have very low resistance, and they spec no current limit in cyclic charging.



(the alternator section in my Ford's factory service book actually states: "The alternator is self-current limiting"). On most modern alternators, this is FAR more than the 50 amps that DC-DC unit is rated at.

Current limited is just that - the limit. That doesn't mean the alternator is actually going to put out that much - it will only put out as much as is drawn from the bus. Alternators don't push - everything else sucks. :)

But again, even if it did put out its full rated power, most alternators aren't rated for 100% duty cycle, so it's probably going to overheat.


Except it won't. Because if you dump a ton of amps into a battery, you're going to hit surface charge in a hurry, and then the voltage differential is almost nothing, so the battery is going to charge slowly anyway.

http://www.smartgauge.co.uk/surf_chg.html

"If we try to charge the batteries too fast, with a large charger, then they will not become fully charged. Only the surfaces will be charged. Deep inside the plates they will still be in a low state of charge. This causes the "insides" of the plates to sulphate up (a phenomenon usually associated with the surface of the plates)."

And:

"A full, timed, acceptance cycle is the way to cure this problem. Irrespective of what the acceptance charge current has dropped to, the acceptance cycle should continue for a certain period and most chargers do not continue this cycle long enough. Most (there were a few exceptions) older chargers (say from the 1980s and 1990s) seem to have acceptance cycles of around 1 to 3 hours. This really isn't long enough. Many modern chargers now run the acceptance cycle for much longer periods. 8 hours is becoming common. This is much more satisfactory and does ensure that the last remaining 10 percent or so of charge is put into the batteries."


(He speaks the Queen's English - he's saying acceptance cycle, where we'd say absorb stage.)

(especially in this case since [again] the OP here said he wanted to reduce the amount he's running the engine).

Which is why he needs a proper charger. Automotive "charging systems" aren't very good battery chargers.

Oh...they can get the job done. But they take a long time to do it.



13.5-13.8V isn't nearly enough to fully recover a vehicle's battery in a reasonable amount of run time (a battery probably wouldn't last 3 years in a car being charged that little).

Sure it is. A fully charged 12v battery resting voltage is 12.8v, so 13.8v is plenty to get the battery to that point.

As for time - think of what a normal alternator is designed to do (in regards to battery charging)...

Say the starter draws 250a. Say it runs for 3 seconds.
If it ran for a full hour, that would be 250ah.

250ah / 60 minutes = 4.16ah per minute
4.16ah / 60 seconds = .07ah per second
.07ah * 3 seconds = .21ah

So, the typical automotive "charging system" is designed to deliver about 1/5 of an amp hour every time the battery gets depleted by starting the vehicle. (Actually, probably a lot less, since I doubt many starters actually draw the 250a I used in my example.)

How long is that gonna take? Not long. Couple of minutes maybe?
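Just to put numbers on it (the 10a figure is an assumed average net charge current back into the starting battery; the real current tapers, but the order of magnitude holds):

# Amp-hours used by one engine start, and how long to put them back
starter_amps = 250.0
crank_seconds = 3.0
ah_used = starter_amps * crank_seconds / 3600.0      # ~0.21ah

recharge_amps = 10.0     # assumed average net current back into the battery
minutes = ah_used / recharge_amps * 60.0
print(f"{ah_used:.2f}ah used, roughly {minutes:.1f} minutes to put back at {recharge_amps:.0f}a")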
 

4x4junkie

Explorer
No, it doesn't. It dumps however much can overcome the resistance of the battery and the wiring (and the frame if that's what's being used for the ground), at whatever the bus voltage is.


For example:

http://aimpartsonline.com/faq

"Q: My charging system spec's are 14.5 volts. Why is my system running at 13.0 volts?

A: There is a load being applied to system. When amp's go up volts go down.

Example: A 105amp alternator with a 60amp load will run at 13.0 volts."


So how much can be expected to flow through the battery at that voltage?

Here's the math to figure that:

http://www.smartgauge.co.uk/nosurge2.html


But, even if the alternator did dump its full rated output to the battery, it would likely void the warranty of just about any deep cycle battery. Most deep cycle batteries have a max charge current rating of C/4 (amp hour Capacity divided by four). So for a typical 105ah deep cycle battery, that would be 25a.

Some batteries can handle more. Odysseys are crazy - they can take C*4. Optimas have very low resistance, and they spec no current limit in cyclic charging.

Seems you missed this little tid-bit right underneath what you quoted from that:
"Remember a under-charged battery is also a load."

Also, it seems you may not realize that the statement about a 105A alt putting out 60A was just an example given to answer a very general question (and is not even typical). Performance like that might be common at an idle, but a 105A-rated alt more typically puts out about 80A at engine RPMs common while driving. And any battery that is sufficiently discharged that it prevents the alternator from reaching its voltage set point is obviously doing so by taking the full brunt of its output minus whatever the vehicle's systems are using.

That one about surge currents is not relevant here. We are talking about an alternator charging a battery, not connecting charged & discharged batteries together. A discharged battery with 11.8V on it might come up to something around 12.6V when you first put a significant charge current to it. Well below an alternator's 14.2-14.4V set point = alternator charges battery.

And if you recall, the OP had proposed using four 115Ah Die-Hard deep-cycle batteries, so there should be no risk whatsoever with his alternator exceeding a C/4 charge rate on them unless he's got some crazy alternator setup (a 105A-rated alt with its typical 80A charge rate would be hitting them at well under C/5 max). Besides, these are dual-use starting/deep-cycle batteries, which I do believe are a little more robust in regard to charge current than a standard deep-cycle-only battery would be.



Current limited is just that - the limit. That doesn't mean the alternator is actually going to put out that much - it will only put out as much as is drawn from the bus. Alternators don't push - everything else sucks. :)

But again, even if it did put out its full rated power, most alternators aren't rated for 100% duty cycle, so it's probably going to overheat.
No disagreement there. If the battery isn't discharged enough to suck down the bus voltage, the alternator will reach its set point and simply hold it there.

Early alternators did commonly have overheating issues (I've heard it said often that if you ever kill your car's battery by leaving the lights on overnight, charge the battery with a charger until it's fully recharged, don't just jump-start the car. Reason for this was the deeply-discharged battery sucks so much power from the alternator it could burn it out).
Modern alternators don't have this issue nearly as much. They are self-current-limited to a point where the design can handle it (just as stated in my vehicle's manual). They typically have their fans mounted within an internal housing, which allows them to draw more cooling air through the windings and the rectifier than an older external pulley-mounted fan could.


Except it won't. Because if you dump a ton of amps into a battery, you're going to hit surface charge in a hurry, and then the voltage differential is almost nothing, so the battery is going to charge slowly anyway.

http://www.smartgauge.co.uk/surf_chg.html

"If we try to charge the batteries too fast, with a large charger, then they will not become fully charged. Only the surfaces will be charged. Deep inside the plates they will still be in a low state of charge. This causes the "insides" of the plates to sulphate up (a phenomenon usually associated with the surface of the plates)."

And:

"A full, timed, acceptance cycle is the way to cure this problem. Irrespective of what the acceptance charge current has dropped to, the acceptance cycle should continue for a certain period and most chargers do not continue this cycle long enough. Most (there were a few exceptions) older chargers (say from the 1980s and 1990s) seem to have acceptance cycles of around 1 to 3 hours. This really isn't long enough. Many modern chargers now run the acceptance cycle for much longer periods. 8 hours is becoming common. This is much more satisfactory and does ensure that the last remaining 10 percent or so of charge is put into the batteries."


(He speaks the Queen's English - he's saying acceptance cycle, where we'd say absorb stage.)



Which is why he needs a proper charger. Automotive "charging systems" aren't very good battery chargers.

Oh...they can get the job done. But they take a long time to do it.
Again, I didn't argue that one bit at all... That is part of why I mentioned the inverter/charger unit, if the OP wants to fully recharge his batteries, say, to store the vehicle (all one has to do is plug it in overnight; done, batteries fully charged).
I guess if one was really overly concerned about it, then they could make use of the DC-DC charger to top off the batteries before they go to bed at night... but they would have to run the engine to do so, which for the THIRD time now, the OP said he did not want to do that (plus if there's anything left on like a fan running so one can stay cool at night, that defeats the whole purpose of it anyway).


Sure it is. A fully charged 12v battery resting voltage is 12.8v, so 13.8v is plenty to get the battery to that point.

As for time - think of what a normal alternator is designed to do (in regards to battery charging)...

Say the starter draws 250a. Say it runs for 3 seconds.
If it ran for a full hour, that would be 250ah.

250ah / 60 minutes = 4.16ah per minute
4.16ah / 60 seconds = .07ah per second
.07ah * 3 seconds = .21ah

So, the typical automotive "charging system" is designed to deliver about 1/5 of an amp hour every time the battery gets depleted by starting the vehicle. (Actually, probably a lot less, since I doubt many starters actually draw the 250a I used in my example.)

How long is that gonna take? Not long. Couple of minutes maybe?

So then what is the whole purpose of chargers that are equipped with an "absorption" stage that lifts the battery's voltage to 14.2-14.4 volts, if a battery is to accept a full recharge in any reasonable amount of time? By your statement, such units must be a complete waste of money, and all you need to do is float one at 13.8V??

Sure, I guess if you connect 13.8V to one, it'll probably eventually reach something close to full charge... after many hours. Who drives their car that long every time they start it? Typical is about 10-30 minutes each time, sometimes even less. (Besides, I was speaking of a deep-cycle that had been discharged, not the car's starting battery; dunno how we got there. You'll almost never fully recharge a discharged deep-cycle at 13.8V, forget about 13.5.)
 

dwh

Tail-End Charlie
Seems you missed this little tid-bit right underneath what you quoted from that:
"Remember a under-charged battery is also a load."

Since I've made that statement myself here a couple of times recently - I think it's probably safe to assume that I already knew that.

So, what's your point?



Also, it seems you may not realize that the statement about a 105A alt putting out 60A was just an example given to answer a very general question (and is not even typical). Performance like that might be common at an idle, but a 105A-rated alt more typically puts out about 80A at engine RPMs common while driving.

No, they are typically *rated* to do that, but they don't unless there is 80a of load.



And any battery that is sufficiently discharged that it prevents the alternator from reaching its voltage set point is obviously doing so by taking the full brunt of its output minus whatever the vehicle's systems are using.

No, there is no full brunt. The alternator makes only as much power as is being drawn from the bus. If the battery can only accept 20a due to its resistance, then even a million amp alternator is only going to produce 20a.

Hoover Dam has a 2000 megawatt potential capacity, but if there is nothing hooked up but a 20w light bulb, then 20w is all the power that is produced. The rest is just "unused potential".


That one about surge currents is not relevant here.

It's absolutely relevant, because it shows how to figure the amount of current which *can* flow through a battery at X voltage.

Also, a cranking battery rated at 650 CCA can supply more maximum amps than a 100a alternator.

So if there's no giant amp flow from a 650a battery, there certainly won't be any more from a 100a alternator.


And if you recall, the OP had proposed using four 115Ah Die-Hard deep-cycle batteries, so there should be no risk whatsoever with his alternator exceeding a C/4 charge rate on them unless he's got some crazy alternator setup (a 105A-rated alt with its typical 80A charge rate would be hitting them at well under C/5 max).

Of course, I recall his proposed bank size. If you'll read what I wrote, I specifically mentioned, "So for a typical 105ah deep cycle battery, that would be 25a." I was pointing out that *if* a 100a alternator actually dumped 100a, then that would be too much for a typical battery.

And I agree; with 4 deep cycle batteries and a 100a alternator, he's not going to exceed the charge rate *on that battery bank*. Even if the alternator put out 100a, which most of the time - it won't.

Because the battery bank is going to hit surface charge (relatively) quickly and then the amp flow is going to be minimal anyway because the voltage-regulated alternator is a constant voltage charging system, not a constant current charging system.

Which is why, with that size bank, he should be using a 3-stage charger that can do a constant current bulk stage.


Besides, these are dual-use starting/deep-cycle batteries, which I do believe are a little more robust in regard to charge current than a standard deep-cycle-only battery would be.

I believe the Diehard Platinums are supposed to be re-badged Odysseys. And I did specifically mention the charge current rating of Odysseys.


I guess if one was really overly concerned about it, then they could make use of the DC-DC charger to top off the batteries before they go to bed at night... but they would have to run the engine to do so, which for the THIRD time now, the OP said he did not want to do that (plus if there's anything left on like a fan running so one can stay cool at night, that defeats the whole purpose of it anyway).

Yea, you DO keep harping on that. The problem is this - TANSTAAFL.

As I've said already: If he wants to minimize engine run-time, he's gonna need a proper charger, because a crappy stock automotive charging system isn't gonna get the job done properly on a 400ah bank in a short time. If he depends on his voltage-regulated alternator and short engine run times, he's gonna kill an expensive battery bank.

And no, the DC-DC 3-stage isn't going to magically get it done in any sort of "short time" either. But it'll get it done in a "shorter" time, and that's what is meant by "minimizing run-time". Which, I also already mentioned.



So then what is the whole purpose of chargers that are equipped with an "absorption" stage that lifts the battery's voltage to 14.2-14.4 volts, if a battery is to accept a full recharge in any reasonable amount of time? By your statement, such units must be a complete waste of money, and all you need to do is float one at 13.8V??

Sure, I guess if you connect 13.8V to one, it'll probably eventually reach something close to full charge... after many hours. Who drives their car that long every time they start it? Typical is about 10-30 minutes each time, sometimes even less. (Besides, I was speaking of a deep-cycle that had been discharged, not the car's starting battery; dunno how we got there. You'll almost never fully recharge a discharged deep-cycle at 13.8V, forget about 13.5.)

In that example I was specifically showing that a *cranking battery* typically never does get drained very much (10th of a percent maybe). So recharging it is trivial. Takes a few minutes to recharge every time you start the engine.

And *that* is what a typical automotive "charging system" is designed to cope with.



But to answer your question: What makes you think that an absorb stage is intended to save time? It surely isn't. It's the constant current bulk stage that saves time, not the absorb stage. If you're gonna have to simmer the battery anywhere from 4-12 hours anyway, you want to get it bulked up and switch to absorb as quickly as you can.
 
