Wiring solar to aux fuse block?

dwh

Tail-End Charlie
And one final thought (okay, I admit it...I had a nice cup of joe a while ago...):

You are using a SunSaver charge controller. Factory programmed to bulk stage up to 14.4v.

HandyBob recommends taking batteries up to 14.8v. I totally agree with him.

But will your charge controller, programmed to only go to 14.4v, still get your battery "fully" charged?

Sure it will. But it'll need a longer time in the absorb stage to get the electrolyte fully 100% saturated than it would if the charge controller bulked it to 14.8v before dropping into absorb.

In other words: Even if there were a 0.4v drop - it wouldn't matter, because even with the drop, it's still a high enough voltage to get the electrolyte fully saturated...as long as it has enough time to get it done.
 

CaliMobber

Adventurer
Maybe we are just saying the same things and our minds are looking at it differently. In a solar setup we don't have the time to slowly, eventually top off with the smaller wire, so it's looked at as not fully charging, since it isn't able to do it in the time allotted.

My explanation was wrong

Yea, 14.4v can fully charge, but it all depends on temperature and time. It has a maximum of 15v with temp compensation, but with mine the compensation will be off when it's hot out, since the controller is in my car, not under the hood with the batteries.

I wish I had some coffee right now :sombrero:.
 

DiploStrat

Expedition Leader
But, Time Matters

Using your calculator, which measures about the same as my tables, I come up with these numbers:

Alternator Output: 250A

Distance: 20 ft (My batteries are way in the back.)

Cable: 2x1/0 (100mm2)

Drop = 0.49v


As expected, drop the charge rate to 150A and your voltage drop drops to: 0.29v, a lot more bearable.

But compare these numbers with a 6 AWG wire, a commonly used size.

Alternator Output: 150A (250A would make it worse)

Distance: 20 ft

Cable: 6 AWG (13 mm2)

Drop = 2.3v

A 2v drop is going to hurt.

It is completely true that as the battery's voltage rises, its ability to accept current drops, so even with a 250A alternator, my charge rate doesn't spend many hours above 150A. The point is that engine run times are typically limited and so you want to make as many amps as possible available as fast as possible, thus when wiring up your engine alternator to your camper battery, you want to use heavy cables. If you ignore the AWG numbers and simply look at the size of the wire in mm2, it becomes easier to see the difference.
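If you want to check these figures without a calculator page, here's a rough Python sketch of the same math, working from copper resistivity and the metric cross-sections (my own back-of-the-envelope numbers, so they land within a few hundredths of a volt of the table values rather than matching them exactly):

```python
# Rough round-trip voltage drop from copper resistivity - approximate,
# room-temperature figures, not any particular online calculator.
RHO_COPPER = 1.68e-8   # ohm-meters
FT_TO_M = 0.3048

def voltage_drop(amps, one_way_ft, area_mm2):
    """Drop across both conductors of a run with the given cross-section."""
    length_m = 2 * one_way_ft * FT_TO_M              # out and back
    resistance = RHO_COPPER * length_m / (area_mm2 * 1e-6)
    return amps * resistance

# 2x1/0 is roughly 107 mm2 of copper; 6 AWG is roughly 13.3 mm2.
print(round(voltage_drop(250, 20, 107), 2))    # ~0.48v
print(round(voltage_drop(150, 20, 107), 2))    # ~0.29v
print(round(voltage_drop(150, 20, 13.3), 2))   # ~2.31v
```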

There are, of course, many variables and one of the most important is the length of the cable run. Mine may be extreme; stick both batteries under the hood, under ten feet apart, and you can get by with a much smaller cable.

Solar chargers typically have lower charge rates as measured in amps, so, the cables can be much smaller, but you still want to minimize losses.

Perhaps another way to put this is that you should size your wires to be able to handle the largest possible load, over the distance, with the minimal acceptable voltage drop.
 

dwh

Tail-End Charlie
In a solar setup we don't have the time to slowly, eventually top off with the smaller wire, so it's looked at as not fully charging, since it isn't able to do it in the time allotted.

True. So with solar there *may be* a reason to run big wire to shave some time off the absorb stage. It depends. If you have to get the battery fully topped up within a small time window of good sun, then it's worth it to run the numbers and maybe try to squeeze out every watt. You would certainly want to optimize as much as possible with an off-grid cabin setup, or a long-term boondocking setup (such as HandyBob's), which is actually the same setup anyway.

But if you run the numbers and find that having a larger wire is only going to shave 10 minutes off a charge cycle that is going to take 6 or 8 hours to complete anyway, it's probably not worth even worrying about.


I was just making the point that voltage drop through a wire won't cause the battery to end up at a lower voltage.

Now, voltage drop through a diode - such as the common battery isolators with heat sinks sold at the auto parts stores - that *can* make the battery end up half a volt low (or whatever, depending on the diode), but whether it does or not depends on whether the voltage regulator is sampling the voltage ahead of the diode (at the alternator) or after the diode (at the battery).


Mmmm, coffee. :)
 

dwh

Tail-End Charlie
Using your calculator, which measures about the same as my tables, I come up with these numbers:

Alternator Output: 250A

Distance: 20 ft (My batteries are way in the back.)

Cable: 2x1/0 (100mm2)

Drop = 0.49v


As expected, drop the charge rate to 150A and your voltage drop drops to: 0.29v, a lot more bearable.

But compare these numbers with a 6 AWG wire, a commonly used size.

Alternator Output: 150A (250A would make it worse)

Distance: 20 ft

Cable: 6 AWG (13 mm2)

Drop = 2.3v

A 2v drop is going to hurt.


It would seem so, but how much would it hurt? How much extra time would it add to the charge cycle?

In your case of course, it's probably going to hurt a lot, because you are up in the > 100a range.


Solar chargers typically have lower charge rates as measured in amps, so, the cables can be much smaller, but you still want to minimize losses.

Yea, maybe.

In this thread we're dealing with a SunSaver 10 and a SunSaver 15. So even using #14 wire, as I did in my example, the voltage drop is under a volt at the max load of 15a, and it goes down from there, so in terms of "minimizing losses" it's pretty much irrelevant. There really isn't much to work with in terms of optimizing.



Perhaps another way to put this is that you should size your wires to be able to handle the largest possible load, over the distance, with the minimal acceptable voltage drop.

That's absolutely correct - for running a load like an inverter.

But for charging, if the wire is big enough to handle the full load, then, in my opinion, the voltage drop isn't going to be an issue.

And, as we've discussed in another thread, I actually think your wire is too small. :)
 

DiploStrat

Expedition Leader
Size Matters

It would seem so, but how much would it hurt? How much extra time would it add to the charge cycle?

In your case of course, it's probably going to hurt a lot, because you are up in the > 100a range.

I'm in the Tiger now, with the weather in the teens. Keeping the heat at about 60F overnight takes about 50Ah. On a 600Ah battery bank, my shore power charger (Magnum 2800) won't even go into bulk charge, only acceptance. So about the highest charge I can see there is 50A and that drops into the 20A range in minutes.

Solar is iffy, but when the sun peeks between the clouds I can get about 5 - 8 A off the roof with a 500w array. Do get a nice MPPT benefit in the cold! :)

That's absolutely correct - for running a load like an inverter.

But for charging, if the wire is big enough to handle the full load, then, in my opinion, the voltage drop isn't going to be an issue.

And, as we've discussed in another thread, I actually think your wire is too small. :)

Full load of what? I would size the wire to the full output of my charge source, no?

It would also seem that voltage drop (OK, circuit loss) matters less if your final voltage stays above your target charging voltage. That is, as long as your charge source is at a higher voltage than your battery, it will charge. But if the voltage drop takes you below the required voltage, say to a value below 14v, then you will get little or no charge, no matter how long you wait. In my case, I have three chargers and each has a slightly different idea of "charged" (typically a combination of voltage and amp flow). Specifically, my Magnum will go to float (13.xv) much sooner than the Blue Sky controller. If it is cloudy and I have to depend on the Magnum, it will float all night and the state of charge will not change much or at all.

And yes, doing it all over, I would have probably gone with a pair of 2/0 cables, but, to make your point, even with a 250A alternator, you don't stay at a charge rate of 250A very long, and then only when the batteries are in a position to take a maximum charge, typically at 50% discharge. So given my usage profile of 125 - 150Ah consumption overnight, my recharge will only stay at the 150A level for about an hour and drop rapidly thereafter. And that brings us back to the great, underappreciated value of a solar (or shore) charger - time. It takes a long time for the surface charge to penetrate the batteries.
 

dwh

Tail-End Charlie
Full load of what? I would size the wire to the full output of my charge source, no?

Aye, that's my way of looking at it.

So your alternator setup would need wire sized for 250a - the expected full load - which is going to pretty much negate any...um...negative influence (was that what they call a double negative? :) ) from voltage drop. Same with using #14 wire in my example of a 15a charger. The wire was sized for the load, and the voltage drop was not enough to worry about.
 

DiploStrat

Expedition Leader
Real World

If we say that 100 mm2 (2 x 1/0 or 4/0) of wire over 20 feet (40 foot round trip) is going to have about 0.5v of drop, then the charge voltage becomes very important. If the alternator is putting out 14.5v or more, then I am still getting a charge of 14v or more, which is pretty good. But if the alternator voltage is lower, then the voltage at the battery will drop below 14v and things are going to take a lot longer, depending on the state of discharge. How much longer? I don't have the setup to test.

Fortunately, as you have posted repeatedly, once the amp flow drops, even to 150A, the voltage drop is less than 0.3v, and when you drop below 50A, the drop is less than 0.1v. So reasonably sized wire can reduce voltage drop to almost nothing in the final acceptance/absorb stage.
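Just to put numbers on that, here's the same kind of back-of-the-envelope resistivity estimate as the sketch earlier in the thread (again my own round figures, not output from any of the calculators):

```python
# 2 x 1/0 (roughly 107 mm2 of copper) over a 20 ft run = 40 ft round trip.
RHO_COPPER = 1.68e-8                        # ohm-meters, room temperature
r = RHO_COPPER * (40 * 0.3048) / 107e-6     # ~0.0019 ohm round trip

for amps in (250, 150, 50):
    print(amps, "A ->", round(amps * r, 2), "v drop")
# 250 A -> 0.48 v, 150 A -> 0.29 v, 50 A -> 0.1 v
```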
 

dwh

Tail-End Charlie
After sleeping on it, I had a feeling that I screwed this up somewhere. Then I forgot about it and only just now remembered to look at it again. And yup, I see that I totally screwed this up. Yes, it happens. :D What I did wrong: I did the first calc using a 1a load and 14.8v. That's fine. Where I screwed it up is when I did the second calc - I dropped the voltage to 10.5v, but FAILED to crank the amperage up to 15a - it was still at 1a.

Whoops.

It doesn't really matter, the end result is the same - by the time the voltage of the battery rises to 14.8v, the voltage drop is down to almost nothing.

But still, I'm a picky bastard, so here are the CORRECT numbers for the second calculation:

Voltage drop: 0.76
Voltage drop percentage: 7.24%
Voltage at the end: 9.74

I'll highlight the errors in bold and put the corrections after the bold.


Go to your favorite voltage drop calculator, and put in some wire size and distance, and then keep lowering the amp number. Watch as the voltage drop goes away. Here's the one at Calculator.net. I've already put in the relevant numbers - 14.8v, 15a [should say 1a since that's what I used in the calculation], #14 wire, 20' loop:

http://www.calculator.net/voltage-d...ance=10&distanceunit=feet&amperes=1&x=90&y=10

The result is:

Voltage drop: 0.051
Voltage drop percentage: 0.34%
Voltage at the end: 14.749

Now change the voltage to 10.5v, which is where the voltage of a fully dead battery would be, and the result is:

Voltage drop: 0.051
Voltage drop percentage: 0.49%
Voltage at the end: 10.449

[those numbers are correct, but using only a 1a load in the calculator, the correct numbers with a 15a load are at the top]


So even using only #14 wire, you're going to have around a half a percent drop at 15 amps [should say just over 7 percent drop at 15 amps], and about a third of a percent drop at 1 amp. Which means that by the time the battery reaches full, there won't be enough voltage drop to matter. It doesn't matter what the drop would theoretically be at full load - because the loop voltage is being regulated by the battery and the charger is going to keep charging until it sees 14.8v on the loop.


Now change the wire size to #6. At 14.8v:

Voltage drop: 0.0079
Voltage drop percentage: 0.053%
Voltage at the end: 14.7921


At 10.5v:

Voltage drop: 0.0079
Voltage drop percentage: 0.075%
Voltage at the end: 10.4921


Sure, there's a lot less voltage drop with #6.
But there wasn't enough to matter even with #14. A third of a percent?

That is not "hugely" important.


So yea, 15a charger, #14 wire and a 20' loop. If the battery is fully dead at 10.5v and the charger is pushing the full 15a, then the voltage drop would be 7%. Doesn't matter though, because the battery is still below the voltage potential of the charger, so it's going to charge. And that 7% is still going to go away as the battery voltage rises.
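For anyone who'd rather script it than click through the calculator, here's a rough check of those corrected numbers using published DC resistance per 1000 ft of copper wire (approximate values, so they come in within a hundredth of a volt or so of the calculator's output):

```python
# Approximate DC resistance of copper wire, ohms per 1000 ft, near room temp.
OHMS_PER_1000FT = {14: 2.525, 6: 0.3951}

def drop(awg, loop_ft, amps):
    """Voltage lost in the wire for a given loop length and current."""
    return amps * OHMS_PER_1000FT[awg] * loop_ft / 1000.0

print(f"{drop(14, 20, 1):.4f}")    # ~0.0505v  (#14, 1a)
print(f"{drop(14, 20, 15):.4f}")   # ~0.7575v  (#14, 15a - the corrected 0.76v)
print(f"{drop(6, 20, 1):.4f}")     # ~0.0079v  (#6, 1a)
print(f"{100 * drop(14, 20, 15) / 10.5:.1f}%")  # ~7.2% of a 10.5v battery
```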
 

dwh

Tail-End Charlie
If the alternator is putting out 14.5v or more, then I am still getting a charge of 14v or more, which is pretty good.

Well...not exactly. The alternator has the potential to put out 14.5v, but it won't. It will be putting out power, but the voltage won't be 14.5v - it is going to be whatever the battery voltage is. The output voltage of the alternator won't rise to 14.5v until the battery voltage rises to 14.5v.

So it's not that the alternator puts out 14.5v and the battery only sees 14v because of voltage drop. That's not what actually happens. The voltage of the entire loop - alternator/wire/battery - rises together.
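A toy model might make the "rises together" idea easier to see. Every number below (regulator setpoint, current limit, cable resistance, battery internal resistance) is an invented round figure, not a measurement from anyone's rig - the only point is that the alternator terminal, the cable, and the battery sit on one loop, and the cable drop shrinks as the battery comes up and the current tapers off:

```python
# Toy charging loop: regulated alternator -> cable -> battery.
# All values are invented round numbers, for illustration only.
V_SET   = 14.5    # regulator setpoint at the alternator terminal (volts)
I_LIMIT = 250.0   # alternator current limit (amps)
R_WIRE  = 0.002   # round-trip cable resistance (ohms)
R_BATT  = 0.005   # assumed battery internal resistance (ohms)

def loop_state(v_rest):
    """Current and terminal voltages for a given battery resting voltage."""
    amps = min(I_LIMIT, max(0.0, (V_SET - v_rest) / (R_WIRE + R_BATT)))
    v_battery_terminal = v_rest + amps * R_BATT
    v_alternator = v_battery_terminal + amps * R_WIRE
    return amps, v_alternator, v_battery_terminal

for v_rest in (12.2, 12.8, 13.4, 14.0, 14.4):
    amps, v_alt, v_batt = loop_state(v_rest)
    print(f"battery {v_rest}v: {amps:5.0f}a, alternator {v_alt:.2f}v, "
          f"cable drop {v_alt - v_batt:.2f}v")
```

With these made-up numbers the alternator terminal starts out below its 14.5v setpoint (it is current limited, so it tracks the battery), and the half-volt cable drop at 250a fades to a few hundredths of a volt by the time the battery approaches full.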

This is what I was saying about that false idea about voltage drop. I'm starting to think that perhaps the prevalence of voltage drop calculators on the net has something to do with this idea getting so widespread.
 

DiploStrat

Expedition Leader
We're Close

I think we are circling the old Charlie Sterling (Sterling Products - http://sterling-power.com) vs. Chris Gibson (Smart Gauge - http://www.smartgauge.co.uk) issue:

-- Sterling: "Most alternators only put out about 13v thus you need my regulator product to get the voltage high enough for a complete and fast charge."

-- Gibson: "That WAS true, but now most alternators put out over 14v, so there is no longer any need for an expensive secondary regulator, you just need an intelligent relay."

If the alternator set voltage is high enough, e.g., over 14v, then we are in violent agreement: any voltage drop (as long as the wires don't get too hot) will manifest itself simply as a longer charge time.

One of the "secrets" of the Sterling line of products is a separate sense wire (which, as it carries next to no current, will have next to no voltage drop) that tells a Sterling Alternator-to-Battery or Battery-to-Battery charger to ramp up the voltage until the circuit voltage reaches the target voltage. In my case, since I don't use a Sterling A2B or B2B, my alternator can only sense from the load-carrying wire, so I agree - I could, in fact, go at least one gauge heavier.
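A minimal sketch of the remote-sense idea (the generic concept, not Sterling's actual firmware - the cable resistance and charge current below are made-up numbers): the charger keeps nudging its output up until the voltage it reads at the battery end, over the no-current sense wire, hits the target.

```python
# Generic remote-sense compensation sketch - invented numbers, not Sterling's.
R_WIRE   = 0.015   # assumed round-trip cable resistance (ohms)
TARGET_V = 14.8    # desired voltage at the battery terminals
amps     = 15.0    # assumed charge current

output_v = TARGET_V
sense_v  = output_v - amps * R_WIRE        # what the sense wire reports
while sense_v < TARGET_V:
    output_v += 0.01                       # ramp the charger output up
    sense_v = output_v - amps * R_WIRE

print(f"charger output {output_v:.2f}v, battery sees {sense_v:.2f}v")
# output ends up a little over 15v while the battery end sits at the 14.8v target
```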

Sterling and Gibson do agree that proper fast charging of modern batteries requires higher voltages and higher amperage.
 

CaliMobber

Adventurer
For fun throwing this out there.

My CS144 alternator charges at 15v when I first start the car in cool weather. It eventually drops to 14.4v after about 30 minutes of driving. I assume it's because the engine bay finally warms up.

I also replaced the stock charge wire with 2-gauge a while back, did the big 3 grounding mod, and ran a 2-gauge wire all the way to the back of the car, grounding it to the body every 3 ft or so.


We totally stole this thread, but oh well, it was answered before we all started on our rants... I have my coffee today :coffee:
 

DiploStrat

Expedition Leader
Sounds Right.

For fun throwing this out there.

My CS144 alternator charges at 15v when I first start the car in cool weather. It eventually drops to 14.4v after about 30 minutes of driving. I assume it's because the engine bay finally warms up.

:coffee:

I would say that this is typical performance; my truck does the same, only the starting voltage is well above 15.5v. Lifeline, Trojan, and others publish charts which show how the target charging voltage varies with battery type and temperature.

Actually, for historic reasons, I installed an intelligent relay controller with a high voltage cutoff which is limited to a maximum of 15v. At 10F my Lifelines want 15.5v but my relay opens at 15v. The Chevrolet happily zooms up to 15.5v and more. In the long run I will either add a force combine switch or replace the relay controller with a Blue Sea unit which has a high voltage cutout of 16.5v. Fortunately, I don't spend that much time at temps much below 20F.
 

CaliMobber

Adventurer
Yea, here are the Optima charging specs - rapid recharge says 15.6v max:

Recommended charging information:

Alternator:
13.65 to 15.0 volts, no amperage limit.

Battery Charger:
13.8 to 15.0 volts, 10 amps maximum, approximately for 6-12 hours.

Cyclic Applications:
14.7 volts, no current limit as long as battery temperature remains below 125°F (51.7°C). When current falls below 1 amp, finish with 2 amp constant current for 1 hour.

Rapid Recharge:
Maximum voltage 15.6 volts (regulated), no current limit as long as battery temperature remains below 125°F (51.7°C). Charge until current drops below 1 amp.

Float Charge:
13.2 to 13.8 volts, 1 amp maximum current, time indefinite (at lower voltage).
 

dwh

Tail-End Charlie
Yea, here are the Optima charging specs - rapid recharge says 15.6v max:

Recommended charging information:

Alternator:
13.65 to 15.0 volts, no amperage limit.

Battery Charger:
13.8 to 15.0 volts, 10 amps maximum, approximately for 6-12 hours.

Cyclic Applications:
14.7 volts, no current limit as long as battery temperature remains below 125°F (51.7°C). When current falls below 1 amp, finish with 2 amp constant current for 1 hour.

Rapid Recharge:
Maximum voltage 15.6 volts (regulated), no current limit as long as battery temperature remains below 125°F (51.7°C). Charge until current drops below 1 amp.

Float Charge:
13.2 to 13.8 volts, 1 amp maximum current, time indefinite (at lower voltage).

Yea, I like that Optima page. It's a perfect illustration that there is more than one way to skin a battery charging cat.
 
