35 watt solar panel

dwh

Tail-End Charlie
Agreed, without an external sensing line or sensor, the controller isn't able to monitor the voltage loss on the wiring between the controller and the battery. It's best to have that side as short as possible.

That applies to constant current chargers. Doesn't apply to constant voltage chargers.
 

Crom

Expo this, expo that, exp
(Yes, part of that last post was a lie. I'm waiting to see who can call me on it. :D )

What I do while charging is put my multimeter leads on the pos. and neg. battery posts; whatever voltage I see is what it's charging at. That's how I know the voltage of the charging loop. As the battery recharges, the voltage will begin to rise. If I want to know how much current the battery is receiving, I move the positive lead to the meter's current jack, rotate the dial to a current range, and place the multimeter in series. :)
 

4x4junkie

Explorer
I guess wherever your "lie" is, it must be throwing me way off, because I don't understand at all what it is you're trying to explain with the meter and a short-circuit.

I see it 100% the other way... Having the lowest voltage drop completely applies if you want the Absorb charge cycle (a CV charge mode) to complete as quickly as possible. This is because the controller is measuring the battery's voltage within the controller itself, not at the battery. There is no way for it to know the battery's actual voltage unless maybe it has some way to switch off the current for a split second, measure the battery's voltage, and then resume at a reduced current if needed.

If the line has, let's say, a 0.5 V drop, the charger will drop out of Bulk mode with the battery having only reached 13.9 volts or so, which then increases the Absorb cycle time due to the V drop causing reduced current flow (and that is only if the Absorb mode's timer doesn't time out beforehand, leaving the battery at less than full charge).

A constant-current charger, OTOH (such as during Bulk mode), initially doesn't care about a voltage drop on the line. The voltage out of the charger would simply rise slightly to accommodate the V drop until the charger reaches its point of current-limiting again (a NiMH/NiCd trickle charger would be another example of a CC charger). However, on a 3-stage charger, compensating like that for a V drop will "fool" it into going to Absorb mode too soon.

Besides, all of this is moot anyway, regardless of who's right. A 3-stage charger uses both CC and CV modes anyway.
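
To put rough numbers on the scenario above (editor's sketch in Python; the 0.5 V drop is the example figure from the post, and 14.4 V is a typical absorb setpoint for a 12 V battery):

```python
# Illustrative numbers only: what a controller that senses voltage at its
# own terminals "sees" versus the actual battery voltage when the wiring
# has a 0.5 V drop at bulk current.

ABSORB_SETPOINT = 14.4   # volts; typical absorb setpoint for a 12 V battery
line_drop = 0.5          # volts lost across the battery wiring at bulk current

# The controller leaves Bulk when the voltage at ITS terminals hits the
# setpoint, so at that moment the battery posts sit lower by the line drop.
voltage_at_controller = ABSORB_SETPOINT
voltage_at_battery = voltage_at_controller - line_drop

print(f"Controller terminals: {voltage_at_controller:.1f} V")  # 14.4 V
print(f"Battery posts:        {voltage_at_battery:.1f} V")     # 13.9 V
```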
 

pods8

Explorer
"Extra voltage" the PV (edit, dang auto correct) side has the potential of 17+ usually seems the logical place to lose volts since that extra potential isn't needed on the battery side.

Also, that voltage never goes away; just the current is started/stopped. If it's getting sun, then the voltage is there.
 

dwh

Tail-End Charlie
What I do while charging is put my multimeter leads on the pos. and neg. battery posts; whatever voltage I see is what it's charging at. That's how I know the voltage of the charging loop. As the battery recharges, the voltage will begin to rise. If I want to know how much current the battery is receiving, I move the positive lead to the meter's current jack, rotate the dial to a current range, and place the multimeter in series. :)

Yup, you got it. The voltage at the battery terminals is what the voltage of the charging circuit is. :)

Measuring at the controller end is the short circuit that is misleading.
 

bat

Explorer
I think real-world testing is in my future as soon as my new controller arrives in the mail. Yesterday I looked at a solar setup in my friend's van that was not working (he's leaving for vacation): 120 watts on a 4.5 amp SunGuard controller :Wow1: so I installed my controller. This is great info; thanks for posting and helping others.
 

pods8

Explorer
I guess the real question is: are basic PWM controllers taking a voltage reading off the charge loop itself? I would tend to think they're just seeing the voltage across the controller short circuit as mentioned (however, if they took a reading in parallel with the returning negative wire, that should provide the real voltage of the loop; not sure if they do that or not).
 

dwh

Tail-End Charlie
I guess wherever your "lie" is, it must be throwing me way off, because I don't understand at all what it is you're trying to explain with the meter and a short-circuit.

It's all about "potential". Say the charge controller is a non-MPPT. The PV has a Vmp of, say, 17v. The battery is low, so the controller keeps the PV connected to the battery (the majority of the time - i.e., high duty cycle). So the *potential* voltage at the controller end is, say, 17v. If you measure across the terminals, you might actually see that much (depends on the actual duty cycle at the time you take the measurement).

It's a "short" circuit because you are bypassing the charging circuit loop when you take the measurement. It doesn't have to be a "dead short" to be a "short circuit"- it just has to be shorter than the other loop. So, you're likely to see the full "potential" voltage across those terminals (through your meter), instead of the "actual" voltage of the charging loop.

Taking the reading at the other end will show you what the actual voltage of the charging loop is, because you are not bypassing the charging loop like you are when you take the reading at the controller end.


I see it 100% the other way... Having the lowest voltage drop completely applies if you want the Absorb charge cycle (a CV charge mode) to complete as quickly as possible. This is because the controller is measuring the battery's voltage within the controller itself, not at the battery. There is no way for it to know the battery's actual voltage unless maybe it has some way to switch off the current for a split second, measure the battery's voltage, and then resume at a reduced current if needed.

There again is that idea of, "there's this end, and there's that end". There isn't. There is no end - it's a loop.

So the controller isn't "measuring the battery's voltage from the controller end". It's "measuring the voltage of the charging loop." It knows exactly what the battery's voltage is, because the battery is what is controlling (limiting) the voltage of the charging loop.

But yea, voltage drop will cause it to take longer to get to 14.4v.


If the line has, let's say, a 0.5 V drop, the charger will drop out of Bulk mode with the battery having only reached 13.9 volts or so,

It shouldn't drop out of bulk until the loop (and the battery, since the battery controls the voltage of the loop) reaches 14.4v.


which then increases the Absorb cycle time due to the V drop causing reduced current flow (and that is only if the Absorb mode's timer doesn't time out beforehand, leaving the battery at less than full charge).

By the time the charger drops to absorb (CV mode), there's already less voltage drop, because the battery has already reached 14.4v. And, once it's in CV mode, as the current tapers off, so does the voltage drop, until there isn't enough voltage drop to matter.

But again, yea, it can cause it to take longer to finish. But it's not going to be nearly as significant an effect as it would be for the bulk stage - because the battery already has a surface charge. Voltage drop might cause it to take longer to reach bulk surface-charge voltage, but it's not going to have much of an effect on absorb.

(And I think the majority of multi-stage chargers don't have a timer on absorb - they decide to drop to float when the current flow drops to whatever...1a for some chargers, 2a for others, 3a for a few others. The Iota IQ/4 control module does have a timer on the absorb stage.)


A constant-current charger, OTOH (such as during Bulk mode), initially doesn't care about a voltage drop on the line.

Neither does a CV charger.


The voltage out of the charger would simply rise slightly to accommodate the V drop until the charger reaches its point of current-limiting again

It would rise, but not because it's trying to overcome voltage drop that it doesn't know about. It's just cranking up the potential to try and reach the current limit. It'll get there, or as close as it can, and keep doing that until the charging loop reaches 14.4v.


Besides, all of this is moot anyway, regardless of who's right. A 3-stage charger uses both CC and CV modes anyway.

Not always. A PWM solar charger such as a SunSaver is a 3-stage charger - but it's only CV - it has no real CC mode (other than running wide open at whatever the PV's potential is). A shore-powered 3-stage would usually have a CC mode. (And a good DC-DC charger as well.)
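
Since the thread keeps returning to how the stages hand off, here is a minimal sketch of generic 3-stage logic (editor's illustration; the thresholds and the current-based float trigger are assumptions, and as noted above some units use an absorb timer instead):

```python
# A minimal sketch of 3-stage charge logic (bulk -> absorb -> float).
# Thresholds are illustrative; real chargers differ.

def next_stage(stage, battery_volts, charge_amps,
               absorb_setpoint=14.4, float_trigger_amps=2.0):
    """Return the charge stage for the next control step."""
    if stage == "bulk" and battery_volts >= absorb_setpoint:
        return "absorb"          # hold constant voltage from here on
    if stage == "absorb" and charge_amps <= float_trigger_amps:
        return "float"           # battery is (nearly) full
    return stage

stage = "bulk"
for volts, amps in [(13.2, 8.0), (14.4, 6.5), (14.4, 3.0), (14.4, 1.5)]:
    stage = next_stage(stage, volts, amps)
    print(volts, amps, "->", stage)
```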
 

dwh

Tail-End Charlie
"Extra voltage" the PV (edit, dang auto correct) side has the potential of 17+ usually seems the logical place to lose volts since that extra potential isn't needed on the battery side.

Also that voltage never goes away, just the current is started/stopped, if its getting sun then the voltage is there.

If there's no current, there's no voltage. There has to be a complete circuit for a solar cell to work. That's why PV, unlike a wind generator, doesn't need a dump load.
 

dwh

Tail-End Charlie
I guess the real question is: are basic PWM controllers taking a voltage reading off the charge loop itself? I would tend to think they're just seeing the voltage across the controller short circuit as mentioned (however, if they took a reading in parallel with the returning negative wire, that should provide the real voltage of the loop; not sure if they do that or not).

You got it.

But we do know for sure that they are reading the current of the loop, so I would think engineers at places like Morningstar would want to read the correct voltage as well.

Lowbuck engineers? Well...who knows what they are doing.
 

pods8

Explorer
If there's no current, there's no voltage. There has to be a complete circuit for a solar cell to work. That's why PV, unlike a wind generator, doesn't need a dump load.

Potential voltage is what I was talking about.

On the panel side, I don't really see how the voltage can drop at the times current flows, assuming all cells have sun on them. The panel is going to put out 17V regardless whenever anything creates a draw. That's where the PWM switching on/off creates a lowered effective voltage after the controller, but the incoming feed is going to be something like 17V (minus losses) going on/off/on/off, rather than the panel's voltage being pulled down.
 

dwh

Tail-End Charlie
Potential voltage is what I was talking about.

On the panel side, I don't really see how the voltage can drop at the times current flows, assuming all cells have sun on them. The panel is going to put out 17V regardless whenever anything creates a draw. That's where the PWM switching on/off creates a lowered effective voltage after the controller, but the incoming feed is going to be something like 17V (minus losses) going on/off/on/off, rather than the panel's voltage being pulled down.

The voltage of the PV will be whatever the battery voltage is - not the rated Vmp of 17v (or whatever) - because when the PV is connected to the battery, it becomes part of the charging loop. So now the loop isn't battery-wire-controller, it's battery-wire-controller-wire-PV. And the voltage of the charging loop is controlled by the battery. (With a non-MPPT controller. With an MPPT controller, there are two separate loops or circuits.)

Once the battery voltage has risen to the set point, the PWM regulates the max voltage of the circuit by cutting out (reducing the duty cycle). Otherwise the PV, which has a [whatever, say 17v] potential, will keep pushing the voltage of the charging circuit up, which will overcharge the battery. But until the battery gets to the set point, IT (the battery) is regulating the voltage of the loop.
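
A toy model of that duty-cycle regulation (editor's sketch, not any real controller's algorithm; the battery model and step sizes are made up purely for illustration):

```python
# Toy PWM regulation loop: once the battery nears the setpoint, trimming
# the duty cycle is what holds the loop voltage at 14.4 V instead of
# letting the panel's potential push it higher.

setpoint = 14.4
battery_v = 14.2        # battery already near full
duty = 1.0              # start wide open (high duty cycle)

for step in range(15):
    # Crude battery model: voltage creeps up with average charge current
    # and settles back slightly on its own.
    battery_v += 0.05 * duty - 0.02
    if battery_v > setpoint:
        duty = max(duty - 0.2, 0.0)   # cut the on-time back
    else:
        duty = min(duty + 0.2, 1.0)   # open back up
    print(f"step {step:2d}: loop {battery_v:5.2f} V, duty {duty:.0%}")
```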


Because it's all one big loop (circuit), it doesn't matter, with a non-MPPT controller, where in the loop the controller is (to return to an earlier question). With an MPPT controller, then yea - it matters; put the charge controller as close to the battery as possible to minimize the charging time delay (pointed out by 4x4J) that results from voltage drop.

Also, with MPPT, since the *amperage* on the charge circuit will (usually) be higher than on the PV side, while the voltage on the charge side will be lower than on the supply side, you generally need larger wire between battery and controller than between controller and PV. With a non-MPPT controller, you can use the same size wire on both sides. Having larger wire on the battery side doesn't gain you anything unless you use larger wire on the other side of the controller as well.
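
A quick sketch of why the two sides size differently (editor's illustration; the 120 W panel and the 13.0 V charging voltage are assumed numbers, and converter losses are ignored):

```python
# Power is (roughly) conserved through an MPPT converter, so the
# lower-voltage battery side carries more current than the
# higher-voltage PV side.

panel_watts = 120.0
pv_volts = 17.8        # panel operating voltage (Vmp)
battery_volts = 13.0   # charging voltage at the battery

pv_amps = panel_watts / pv_volts            # ~6.7 A on the PV side
battery_amps = panel_watts / battery_volts  # ~9.2 A on the battery side

print(f"PV side:      {pv_amps:.1f} A at {pv_volts} V")
print(f"Battery side: {battery_amps:.1f} A at {battery_volts} V")
# The battery-side run therefore needs the heavier wire (and benefits
# most from being short), matching the advice above.
```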


Here's an article that explains:

http://www.blueskyenergyinc.com/uploads/pdf/Practical_Sailor.pdf


"A typical 80-watt panel has an
operating voltage of 17.8 volts that
delivers 4.5 amps (watts = volts x
amps) in industry standard conditions.
This is the Maximum Power
Point (MPP) for this panel. However,
when the solar panel is connected
to a battery bank—either through
a regulator or directly—the panel’s
operating voltage is pulled down
to
something near the voltage of the
battery bank
, yet the charging current—
4.5 amps—remains the same.

Under these conditions, 60 watts
(12.2 volts x 4.5 amps)

[Or...whatever the battery voltage happens to be, times whatever the current happens to be - he's just using those numbers as an example. - dwh]

is the maximum
charging power the batteries
will ever receive from the 80-watt
solar panel. If the battery voltage is
lower, even less power will be available
to charge the battery bank."


That's what happens with a non-MPPT controller. The next paragraph explains what the MPPT does:

"A Maximum Power Point Tracking
regulator finds and tracks the
MPP for the prevailing conditions,
which in turn determines the best
operating voltage for the panel
. Then,
while the optimum panel voltage is
maintained,
a voltage converter lowers
the output voltage before sending
it on to the batteries. The charging
voltage, after conversion, would vary
with battery condition
, but for our
purposes we’ll use 12.6 volts."
 

4x4junkie

Explorer
You need to explain a little better, then, how the charger is able to detect the voltage at the battery end of the same line that it is charging through when there is a voltage drop across it, because all of my experience with this type of stuff has been the exact 100% opposite of what you are saying.
This stuff about a "loop" makes no sense whatsoever. Yes, current does flow in a loop (usually correctly referred to as a circuit). But at different points of said circuit, the voltage varies (current is what stays the same at any given point in a series circuit, not voltage).
You keep mentioning that a PWM charger is somehow different from other types, yet the part of the circuit between the controller and the battery is the same no matter what. The point at which the charger detects the voltage is obviously within the unit itself if there's no remote sensor at the battery, so again, that 0.5V drop occurring on the wire going out to the battery is outside the charger's ability to detect.
Even your quote from the article says: "the panel’s operating voltage is pulled down to *something near the voltage of the battery bank*." (emphasis changed)
Obviously it's "something near the voltage of the battery bank" instead of "that of the battery bank" because any copper wire has a voltage drop.

A 3-stage charger operates as a CC-mode charger during Bulk mode (again, type is irrelevant; a PWM unit simply supplies the maximum current the panels can deliver, and any voltage short of 14.4V is not regulated). The charger also continuously monitors the voltage it sees at its output terminals. Upon it rising to 14.4V, it switches to Absorb mode and holds a fixed 14.4V (CV) for the duration of the Absorb stage. Yet the battery has only reached 13.9V when this switch to Absorb takes place (the controller only sees the 14.4V at its own output terminals). Now the charger's Absorb stage has to do the work which the (much faster) Bulk stage could otherwise have done to bring the battery up that last 0.5V (and it will take forever doing so, because the current drops off as the battery's voltage continues to rise. By the time it's reached 14.2V, there will be less than half the current flowing as when the Absorb stage first started).
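
A minimal model of that absorb-stage taper (editor's sketch; the 0.05 ohm wire resistance is an assumed figure chosen to give a 0.5 V drop at 10 A, and battery internal resistance is ignored):

```python
# During Absorb the controller holds 14.4 V at its own terminals; the
# charge current is then set by the gap between that and the battery
# voltage, divided by the wire resistance.

absorb_v = 14.4
wire_ohms = 0.05   # assumed: gives a 0.5 V drop at 10 A

for battery_v in (13.9, 14.0, 14.1, 14.2, 14.3):
    amps = (absorb_v - battery_v) / wire_ohms   # Ohm's law on the gap
    print(f"battery {battery_v:.1f} V -> {amps:.0f} A")
# 13.9 V -> 10 A ... 14.2 V -> 4 A: less than half the starting current,
# as stated above.
```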

Manuals for every charge controller I've ever worked with (inverter/chargers too) have all clearly stated the importance of having a minimum of resistance between the controller and the battery to maximize its charging performance (use sufficient-size wire and/or locate it as close to the battery as practical). The manual for the SunSaver 10 / 10L (what I assume is the OP's unit) suggests having it physically located within 10 feet of the batteries.



My SunSaver MPPT has an Absorb timeout (3 hours), as does the OP's unit (3-4 hours). I know many Xantrex units have Absorb timeouts too (1 hour on a FSW3000 inv/chg, for example). The purpose of a timeout is so that a parasitic draw (a fridge running, for example, or a light left on) doesn't cause a unit that uses current-sensing to get stuck in Absorb mode, needlessly overcharging your batteries at 14.4V.
 

Crom

Expo this, expo that, exp
I'll try to word it another way. :)

You need to explain a little better, then, how the charger is able to detect the voltage at the battery end of the same line that it is charging through when there is a voltage drop across it, because all of my experience with this type of stuff has been the exact 100% opposite of what you are saying.

There is no battery "end"; it's a closed circuit. :D

This stuff about a "loop" makes no sense whatsoever. Yes, current does flow in a loop (usually correctly referred to as a circuit).

Makes perfect sense to me. Circuit is closed, current will flow. Circuit is open, current will stop.

But at different points of said circuit, the voltage varies (current is what stays the same at any given point in a series circuit, not voltage).

I don't think so. Let me give an example. If you had 50' of copper conductor between controller and battery, there would be some voltage drop, but if you metered the circuit at the controller you'd see the exact same voltage as if you metered at the battery terminals. The battery sets the voltage. The controller doesn't care how long your run of wire is; it's going to do its job and charge the battery, just less efficiently, taking longer. The difference is that shorter runs of cable have less resistance: I = V/R. Also, it may be helpful to state here that voltage drop is not a static condition; it follows Ohm's law, so as the battery accepts current and recharges (its voltage goes up), the voltage drop will eventually become extremely small, i.e. a tenth or a hundredth of a volt. So it really doesn't matter that much; it's all about efficiency.

That's my understanding in a nutshell. Hope that helps.
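
To attach rough numbers to the shrinking-drop point above (editor's sketch; the 10 AWG wire gauge is an assumption, since the post doesn't specify one; 10 AWG copper is roughly 1 ohm per 1000 ft):

```python
# A rough illustration of the 50-foot example: a 50 ft run is 100 ft
# round trip, so ~0.1 ohm in 10 AWG copper. As the charge current
# tapers, the I*R drop shrinks toward nothing.

round_trip_ohms = 100 / 1000 * 1.0   # ~0.1 ohm for 50 ft of 10 AWG

for amps in (5.0, 2.0, 0.5, 0.1):
    drop = amps * round_trip_ohms    # Ohm's law: V = I * R
    print(f"{amps:4.1f} A -> {drop:.3f} V drop")
```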
 

bat

Explorer
If you had 50' of copper conductor between controller and battery, there would be some voltage drop, but if you metered the circuit at the controller you'd see the exact same voltage as if you metered at the battery terminals. The battery sets the voltage. The controller doesn't care how long your run of wire is; it's going to do its job and charge the battery, just less efficiently, taking longer. The difference is that shorter runs of cable have less resistance: I = V/R.
If the voltage and amps are the same, why would it take longer for the battery to charge?
 
