35 watt solar blvd panel

4x4junkie

Explorer
If the voltage and amps are the same, why would it take longer for the battery to charge?

In an ideal world it wouldn't. But we are not in an ideal world.

People need to get out a meter (an accurate one) and start taking some actual hard readings on their equipment.
100% I guarantee anyone will find a higher voltage at the output terminals of a charger (ANY charger) than on the terminals of the battery said charger is connected to.
If the wiring is of good size, the voltage difference should be small (0.2V or less). If the wiring is feeble/too lengthy, could even be a whole volt or more.

It's simple physics guys. All copper wiring has a voltage loss.
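The claim is easy to sanity-check on paper before reaching for a meter. A minimal sketch of the arithmetic (the ohms-per-foot figures are approximate published values for copper at room temperature):

```python
# Approximate DC resistance of copper wire, ohms per foot (room temperature).
OHMS_PER_FT = {14: 0.00253, 12: 0.00159, 10: 0.00100, 6: 0.000395}

def voltage_drop(amps, awg, one_way_ft):
    """Round-trip drop: current flows out AND back, so the loop is twice the run."""
    r_loop = OHMS_PER_FT[awg] * one_way_ft * 2
    return amps * r_loop

# 10 A over a 10 ft run of #10 wire: about the 0.2V difference mentioned above.
print(round(voltage_drop(10, 10, 10), 2))  # -> 0.2
# The same 10 A over a 25 ft run of #14: well over a volt.
print(round(voltage_drop(10, 14, 25), 2))
```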
 

dwh

Tail-End Charlie
Then you need to explain a little better how the charger is able to detect the voltage at the battery end of the same line it is charging through when there is a voltage drop across it, because all of my experience with this type of stuff has been the exact opposite of what you are saying. This stuff about a "loop" makes no sense whatsoever.

Ouch. Well, okay...but remember - you asked for it. If it hurts, don't blame me. :)

http://en.wikipedia.org/wiki/Kirchhoff's_circuit_laws#Kirchhoff.27s_voltage_law_.28KVL.29

In particular, this:

http://en.wikipedia.org/wiki/Kirchhoff's_circuit_laws#Generalization


Yes, current does flow in a loop (usually correctly referred to as a circuit). But at different points of said circuit, the voltage varies (current is what stays the same at every point in a (series) circuit, not voltage).

The *measurement* varies at different points, because you are measuring different sections of the loop, rather than the entire loop.
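A hypothetical worked example of that, with numbers chosen to match the 17V-panel / 14V-battery scenario discussed later in the thread:

```python
# Kirchhoff's voltage law: around any closed loop, the source EMF
# equals the sum of all the drops.
emf = 17.0        # PV potential driving the loop (hypothetical)
battery = 14.0    # battery counter-EMF
wire_out = 1.5    # drop across the outbound wire
wire_back = 1.5   # drop across the return wire

loop_sum = emf - (wire_out + battery + wire_back)
print(loop_sum)  # -> 0.0  (the whole loop sums to zero)

# A meter placed across one section reads only that section's share:
print(emf - wire_out - wire_back)  # at the battery terminals -> 14.0
```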


You keep mentioning a PWM charger is somehow different than other types,

No, I keep mentioning that PWM is different than MPPT. An old style solar "regulator" is just a single loop (circuit) when the switch is closed. Battery-wire-controller(switch)-wire-PV. One big loop. A PWM is the same.

An MPPT is different, because it has two loops; battery-controller, and controller-PV, with a converter or transformer in between.


yet the part of the circuit between the controller and the battery is the same no matter what. The point at which the charger detects the voltage is obviously within the unit itself if there's no remote sensor at the battery, so again, that 0.5V drop occurring on the wire going out to the battery is outside the charger's ability to detect.

There is no voltage drop on the outgoing wire, there is voltage drop on the whole loop. If there's no loop, then there's no current, and no voltage drop. The charge controller is part of the loop.


Even your quote from the article says:
. (emphasis changed)
Obviously it's "something near the voltage of the battery bank" instead of "that of the battery bank" because any copper wire has a voltage drop.

Yea, well he didn't specify what he was referring to. You assumed he was referring to voltage drop. Myself, I assumed he was referring to the higher potential of the PV, since it has to have a higher potential than the battery or nothing will flow through the circuit.


A 3-stage charger operates as a CC-mode charger during Bulk mode (again, type is irrelevant, a PWM unit simply supplies the maximum current the panels can deliver, any voltage short of 14.4V is not regulated).

PV through a PWM is a CV charger, since it's not trying to raise the supply potential to overcome resistance and push the amp flow up to the current limit. It just lets the amps fall where they may - which is a CV charger.


The charger also continuously monitors the voltage it sees at its output terminals.

The terminals are both ends of the loop. It might be monitoring that, or it might be monitoring the voltage at its internal shunt.


When that reading rises to 14.4V, it switches to Absorb mode and holds a fixed 14.4V (CV) for the duration of the Absorb stage. Yet at the point this switch to Absorb takes place, the battery has only reached 13.9V (the controller only sees the 14.4V at its output terminals).

The controller sees the voltage of both ends of the loop. The loop won't be at 14.4v until the battery is at 14.4v.

Also, that number of 14.4v for the absorb is off. Most 3-stage chargers default to 14.2v for the absorb stage.


Now the charger's Absorb stage has to do the work which the (much faster) Bulk stage could've otherwise done to bring the battery up that last 0.5V (and will take forever doing so because the current drops off as the battery's voltage continues to rise. By the time it's reached 14.2V, there will be less than half the amount of current flowing as when the Absorb stage first started).

Not exactly. Once the battery hits 14.4v, it's not fully charged. That's a surface charge. The absorb stage is necessary to "simmer" the chemistry for a while until it reaches full saturation - without overheating it, which is what would happen if you cranked constant current until it hit saturation.
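The current taper both posts describe falls out of Ohm's law once the charger holds a constant voltage. A toy sketch (the series resistance value is assumed, lumping wiring and battery internal resistance together):

```python
# Constant-voltage taper: I = (V_set - V_batt) / R_series.
# As the battery voltage climbs toward the set point, current falls toward zero.
V_SET = 14.4     # absorb set point, volts
R_SERIES = 0.15  # wiring + battery internal resistance, ohms (assumed)

def absorb_current(v_batt):
    return max(0.0, (V_SET - v_batt) / R_SERIES)

for v in (13.9, 14.1, 14.2, 14.3):
    print(f"battery at {v}V -> {absorb_current(v):.2f} A")
```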


Manuals for every charge controller I've ever worked with (Inverter/Chargers too) have all clearly stressed the importance of having minimal resistance between the controller and the battery to maximize its charging performance (use of sufficient size wire and/or locating it as close to the battery as practical). The manual for the SunSaver 10 / 10L (what I assume is the OP's unit) suggests to have it physically located within 10 feet or less of the batteries.

I'm looking at the manual for the Gen3 SunSaver:

http://www.morningstarcorp.com/en/support/library/SS3.IOM.Operators_Manual.01.EN.pdf

I must be missing something, because I don't see that.

I do note that on the wire gauge chart, on the line for 10 amps (what the SS-10 can do max), the shortest one-way distance they list in the chart is 14' with #14 wire. The longest at that amperage is 91' with #6.

Neither one is within 10'.



My SunSaver MPPT has an Absorb timeout (3 hours), as does the OP's unit (3-4 hours). I know many Xantrex units have Absorb timeouts too (1 hour on a FSW3000 inv/chg, for example). The purpose of a timeout is so that a parasitic draw (a fridge running, for example, or a light left on) doesn't cause a unit that uses current-sensing to stay stuck in Absorb mode, continually overcharging your batteries at 14.4V needlessly.

Yea, those are default settings. Very conservative.

If you look at the recommendations for Deka batteries, they specify that the absorb not exceed 12 hours (Page 1):

http://www.dekabatteries.com/assets/base/1913.pdf


If you look at this doc from Energy1 batteries about how to properly determine and configure the absorb time, they say:

"Finally, relying on a charger’s preset is going to result in battery failure."

http://www.energy1batteries.com/Tech Papers/Absorption Charge Characteristics.pdf
 

dwh

Tail-End Charlie
If the voltage and amps are the same, why would it take longer for the battery to charge?

The voltage and amps on the charging circuit itself (the loop) don't stay the same - it's a sliding scale.

When charging with a constant-voltage charging system, voltage increases over time and amperage decreases. Wire that is too small will slow that down due to higher resistance (translation: voltage drop).

At first, that is - while the voltage is low and the amperage is high.

But as the voltage rises and the amperage decreases, the voltage drop across the wire also decreases (the wire's resistance stays fixed; less current through it means less drop), until it has basically gone away.
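Two Ohm's-law data points make the distinction concrete: the wire's resistance is constant, so the drop shrinks only because the current does. (The loop resistance here is a hypothetical figure.)

```python
R_LOOP = 0.02  # total resistance of the wiring loop, ohms (assumed)

# Early in the charge: high current, noticeable drop.
print(round(20 * R_LOOP, 2))   # 20 A -> 0.4 V lost in the wire
# Near full charge: current has tapered, and the drop nearly disappears.
print(round(0.5 * R_LOOP, 2))  # 0.5 A -> 0.01 V
```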
 

4x4junkie

Explorer
Nope, not hurt one bit at all. There is not one thing there that discredits anything I have said regarding the voltage being higher at the charger terminals than at the load (battery), or that copper wiring has a voltage drop.

Instead it appears you may have misread that part of the page (which may have been oversimplified), which is where I think you've been thrown off. Read the part right above where that section begins (Kirchhoff's Voltage Law).

Kirchhoff's Law simply states that the total sum of voltages around a circuit (loop) is zero (total drops = EMF). If we have a PV panel creating 17 volts, as in your example, and the battery is at 14.0 volts, the wire from panel to battery has a 1.5V drop (a loss, in layman's terms), and the wire back to the panel also has 1.5V. The sum of all that is 0:

1.5V + 14.0V + 1.5V - 17.0V = 0

But now let's look at what happens when a PWM charge controller is inserted into the loop as you say (let's say the controller is located midway between the panel and the battery):

Now we have broken the 2 wires into 2 sections each, each with a 0.75V drop (2 wires on the panel to controller side, 2 from controller to battery). So now the loop goes like:

0.75V + 0.75V + 14.0V + 0.75V + 0.75V - 17.0V = 0

Remembering that the controller is only capable of measuring voltage at the point of the circuit that is its output terminal, it sees the 14.0V of the battery, plus 0.75V + 0.75V = 15.5V. That 15.5V quite obviously would make the controller curtail its current output very quickly.
So now that the controller has cut the output down to 14.4V (entered Absorb mode), the controller has now "broken" the loop between the battery and the PV panel. Now we have 2 separate "loops", not unlike what you say there would be on an MPPT unit. This means the controller has now limited the EMF to 14.4V within the "loop" that is the controller and the battery.
So now we have the following:

0.2V + 14.0V + 0.2V - 14.4V = 0

Let's say the system has a 10-amp capacity and that each wire has 0.075 ohms resistance.
0.075 ohms at 0.2V (×2 wires) will only allow 2.666 amps to flow to the battery (0.2V ÷ 0.075Ω = 2.666~A). This means the system at this point is operating at barely over ¼ capacity while trying to bring that battery up to the full 14.4V that otherwise would've been the Bulk stage's job (and only gets to be less & less as the battery's voltage rises. In theory, the battery can never actually reach 14.4V).

With that much resistance in the wiring, a 10-amp system's controller would actually quit the Bulk stage with the battery at only 12.9V. (0.75V + 12.9V + 0.75V - 14.4V = 0)(10A × 0.075Ω = 0.75V). I'd call that a pretty significant degradation in charging performance...


So let's shorten up the wiring and get that resistance down to something more reasonable, like 0.005 ohms (about what 5 feet of #10 AWG would be). At 10 amps, 0.005 ohms has a potential of 0.05V.

Now we're looking at something more like the battery reaching 14.3V before the controller decides to quit Bulk:

0.05V + 14.30V + 0.05V - 14.4V = 0

That's certainly a hell of a lot better than it quitting with the battery at 12.9V...
The Absorb stage now can do what it's supposed to: allow the battery to absorb energy and bring it to full charge in the quickest time possible without excessive gassing.
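The two wiring scenarios worked through above reduce to one formula. A quick sketch using the post's own numbers:

```python
V_SET = 14.4  # voltage at which the controller (sensing its own terminals) exits Bulk

def bulk_exit_voltage(i_max, r_per_wire):
    """Battery voltage when the controller first 'sees' V_SET while
    still pushing its maximum current through both wires."""
    return V_SET - i_max * r_per_wire * 2

print(round(bulk_exit_voltage(10, 0.075), 2))  # feeble wiring   -> 12.9
print(round(bulk_exit_voltage(10, 0.005), 2))  # 5 ft of #10 AWG -> 14.3
```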


Now... Can we put this issue to rest? :)



I must be missing something, because I don't see that.

Page 19, Step 1. ;)
 

dwh

Tail-End Charlie
But now lets look at what happens when a PWM charge controller is inserted into the loop as you say (lets say that the controller is located midway between the panel and the battery):

Now we have broken the 2 wires into 2 sections each, each with a 0.75V drop (2 wires on the panel to controller side, 2 from controller to battery). So now the loop goes like:

0.75V + 0.75V + 14.0V + 0.75V + 0.75V - 17.0V = 0

So here, you are using *one loop* battery-PV as your basis.


So now that the controller has cut the output down to 14.4V (entered Absorb mode), the controller has now "broken" the loop between the battery and the PV panel. Now we have 2 separate "loops", not unlike what you say there would be on an MPPT unit. This means the controller has now limited the EMF to 14.4V within the "loop" that is the controller and the battery.
So now we have the following:

0.2V + 14.0V + 0.2V - 14.4V = 0

And here, you are using *two loops* as your basis.

However, there is *no such thing*, because, as you just said, "the controller has now "broken" the loop between the battery and the PV panel."

If the loop has been broken (circuit opened) then there is no voltage, no amperage, and no voltage drop because the supply/source/potential has been disconnected. Nothing flows.

Any voltage drop calcs after that point don't mean anything.


(Though technically, there is still a loop because the controller is still drawing from the battery to power itself. Only now, the source is the battery, not the PV, and the voltage drop calc for the unit drawing 8 milliamps idling from the battery is a different animal than the voltage drop calc when charging.)


Now... Can we put this issue to rest? :D

I'm fine with that. I've sure had enough of it.


Page 19, Step 1. ;)

Ah yes, now I see it. But - it has nothing to do with voltage drop.

"The unit should be located in the same ambient temperature as the battery. Locate the controller within 10 ft (3 M) of the battery bank."

It has to do with temperature. They recommend it within 10' because it does temperature compensation, but not based on the battery's temp. It does it based on the local ambient temperature and doesn't have a provision for a remote temp sensor, so it needs to be near enough to the battery to have approximately the same ambient temperature.
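For reference, lead-acid temperature compensation is typically on the order of -3 to -5 mV per °C per cell relative to 25°C. The exact coefficient varies by manufacturer; the figure below is a representative assumption, not Morningstar's spec:

```python
# Representative lead-acid temperature compensation: -5 mV/degC/cell, 25C reference.
CELLS = 6                 # a 12 V battery has 6 cells
MV_PER_C_PER_CELL = -5.0  # assumed coefficient
REF_C = 25.0

def compensated_setpoint(base_volts, temp_c):
    return base_volts + (temp_c - REF_C) * MV_PER_C_PER_CELL * CELLS / 1000.0

print(round(compensated_setpoint(14.4, 25.0), 2))  # at reference  -> 14.4
print(round(compensated_setpoint(14.4, 0.0), 2))   # cold battery  -> 15.15
print(round(compensated_setpoint(14.4, 40.0), 2))  # hot battery   -> 13.95
```

This is why a controller that senses only its own ambient temperature needs to sit near the battery: a large temperature difference between the two would push the set point the wrong way.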


So, again - it doesn't matter where in the line you put it.

(But now that I think about it - gluing it to the back of a hot solar panel is probably not such a good idea.)
 

Crom

Expo this, expo that, exp
In an ideal world it wouldn't. But we are not in an ideal world.

People need to get out a meter (an accurate one) and start taking some actual hard readings on their equipment.
100% I guarantee anyone will find a higher voltage at the output terminals of a charger (ANY charger) than on the terminals of the battery said charger is connected to.
If the wiring is of good size, the voltage difference should be small (0.2V or less). If the wiring is feeble/too lengthy, could even be a whole volt or more.

It's simple physics guys. All copper wiring has a voltage loss.

Not to labor the point, but I've done this exercise before. I took a 96AH battery that had a 12.50V charge and applied a constant-voltage charger (13.65V) to it, then immediately metered the output terminals of the charger. I did not see 13.65 volts; instead I saw something more like 12.65V. I even metered the amps, and it showed 0.9A output at this voltage. I knew that it was charging and went away for 10 hours. Upon returning I metered it again and saw 13.56V. I did not meter the amps, but I suspect it was like the chart below, outputting less current. Anyways, I knew the battery had reached that level of charge after 10 hours on the battery maintainer. :)

I'm not as adept as dwh is at explaining this as it's not my primary field. But I like to think I do understand a little bit. I think it's just like this chart:

http://www.yuasaeurope.com/images/uploads/uk/images/charts/SWL/charging-characteristics-b.jpg
 

4x4junkie

Explorer

However, there is *no such thing*, because, as you just said, "the controller has now "broken" the loop between the battery and the PV panel."

If the loop has been broken (circuit opened) then there is no voltage, no amperage, and no voltage drop because the supply/source/potential has been disconnected. Nothing flows.

Any voltage drop calcs after that point don't mean anything.

I think I see what the issue is, you take things out of the context in which it was said.

I said it "breaks" the loop...
I was referring to "It breaks the one single loop into TWO loops" (notice the use of quotes around the word "breaks"). Each loop is now a separate closed circuit that is a loop in and of itself, so current no doubt still flows through them.

If you were reading my post, it seems this should've been fairly obvious from the numerous examples given, but by your comment I'm assuming you must've stopped reading at that point?


Not to labor the point, but I've done this exercise before. I took a 96AH battery that had a 12.50V charge and applied a constant-voltage charger (13.65V) to it, then immediately metered the output terminals of the charger. I did not see 13.65 volts; instead I saw something more like 12.65V. I even metered the amps, and it showed 0.9A output at this voltage. I knew that it was charging and went away for 10 hours. Upon returning I metered it again and saw 13.56V. I did not meter the amps, but I suspect it was like the chart below, outputting less current. Anyways, I knew the battery had reached that level of charge after 10 hours on the battery maintainer. :)

I'm not as adept as dwh is at explaining this as it's not my primary field. But I like to think I do understand a little bit. I think it's just like this chart:

http://www.yuasaeurope.com/images/uploads/uk/images/charts/SWL/charging-characteristics-b.jpg

That's completely normal there. :)

The battery was drawing enough current initially that it actually pulled the charger's voltage lower (the charger reached its maximum current output capability). As the battery attains a charge, the battery's voltage rises, and the charger's voltage rises with it (eventually hitting its voltage set point, shown right about at the 8-hour mark on your chart). By the time you came back, the battery had taken enough charge that its voltage was nearly equal to the charger's set point, and the current flow was greatly reduced. Your chart supports this as well.

But did you meter the voltage at the charger's terminals, AND at the same time at the battery terminals while it was charging and note any difference between them?

Edit:
Nevermind... You did say it was a "battery maintainer", on which you probably can't access the output terminals directly without taking the unit's case apart.
But doing so on a charge controller such as the OP's, you would see the voltage some number of millivolts higher at the controller's output than at the battery.


 

dwh

Tail-End Charlie
I said it "breaks" the loop...
I was referring to "It breaks the one single loop into TWO loops" (notice the use of quotes around the word "breaks"). Each loop is now a separate closed circuit that is a loop in and of itself, so current no doubt still flows through them.

If it was still flowing through the PV at that point, then that would be a short on the PV side (Isc).

So are you saying that when the battery is full, the controller disconnects the PV from the battery and leaves the PV sitting there shorted back to itself until the sun goes down?
 

4x4junkie

Explorer
lol
Nope, not even close to what I said.

PV-to-controller is one loop, controller-to-battery is the other loop.
During Absorb, the controller takes just enough energy from the PV loop and passes it over to the battery-side loop to maintain 14.4V (as measured at its output terminals) on the battery-side loop.
During this time you will see a voltage much higher on the PV side than the 14.4V that is on the battery side.

A quick check with a voltmeter an hour or so after Absorb has started would very quickly & easily confirm this.
 

dwh

Tail-End Charlie
He's measuring the PV side.

Watch when he flips the switch - when the PV is connected to the PWM controller, the PV voltage is at battery voltage (appx. 12v). When he flips it to MPPT, then the PV voltage is at whatever the maximum power point is (appx. 19v).

When he flips it to PWM, the PV is at battery voltage because it's just one big loop. When he flips it to MPPT, the PV voltage is different from the battery voltage, because with MPPT it's two loops.



If the PV side (the input side of the PWM controller) is at 12.(something)v, then how in hell could there be a higher voltage than that on the output side of the controller without some sort of voltage up-converter in the middle - which PWM controllers DON'T have.

The only way you can read a higher voltage would be by measuring a short circuit.
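That one-loop/two-loop distinction is also why MPPT can harvest more power. A toy comparison (panel figures and converter efficiency are hypothetical):

```python
# Hypothetical panel: max-power point at 19 V / 5 A; battery sitting at 12.5 V.
V_MP, I_MP = 19.0, 5.0
V_BATT = 12.5

# PWM: one loop, so the panel is dragged down to battery voltage;
# current stays roughly at I_MP, but the extra panel volts do no work.
pwm_watts = V_BATT * I_MP

# MPPT: two loops with a converter between them; the panel is held at
# its max-power point and the converter trades volts for amps (~95% eff, assumed).
mppt_watts = V_MP * I_MP * 0.95

print(pwm_watts, round(mppt_watts, 2))  # -> 62.5 90.25
```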
 

bat

Explorer
I did 2 simple tests:
1- Panel to controller (2ft away) and 25ft of 12 gauge tinned wire to battery
2- Panel to 25ft of 12 gauge tinned wire to controller and 2ft of wire to battery

Regardless of how I did it, all the voltage measurements were the same between the panel and controller, and between the controller and battery.
 

4x4junkie

Explorer
He's measuring the PV side.

Watch when he flips the switch - when the PV is connected to the PWM controller, the PV voltage is at battery voltage (appx. 12v). When he flips it to MPPT, then the PV voltage is at whatever the maximum power point is (appx. 19v).

When he flips it to PWM, the PV is at battery voltage because it's just one big loop. When he flips it to MPPT, the PV voltage is different from the battery voltage, because with MPPT it's two loops.



If the PV side (the input side of the PWM controller) is at 12.(something)v, then how in hell could there be a higher voltage than that on the output side of the controller without some sort of voltage up-converter in the middle - which PWM controllers DON'T have.

The only way you can read a higher voltage would be by measuring a short circuit.

Um.... Hello? We were talking about what happens during ABSORB mode. There was no mention at all of what stage that controller was operating in.
Obviously it was not in Absorb mode; it was in Bulk (operating as one loop).


I did 2 simple tests:
1- Panel to controller (2ft away) and 25ft of 12 gauge tinned wire to battery
2- Panel to 25ft of 12 gauge tinned wire to controller and 2ft of wire to battery

Regardless of how I did it, all the voltage measurements were the same between the panel and controller, and between the controller and battery.

You measured the voltage at the panel, again at the controller and again at the battery and it read the same at all points? Was the panel in full sun?
 

bat

Explorer
You measured the voltage at the panel, again at the controller, and again at the battery, and it read the same at all points?

I was measuring at each end to see if adding 25ft of wire at different points caused voltage drop in my setup, which it did not. The panel voltage to the controller was 20 volts, and from the controller to the battery it was 12.50. Having the controller closer to the battery did not make a difference in my readings. The tech from Morningstar I emailed said to check the voltage from the controller to the battery at each end: if the voltage was lower, shorten the wire; if it was the same, leave it.
 

4x4junkie

Explorer
I was measuring at each end to see if adding 25ft of wire in different points caused voltage drop in my setup which it did not. The panel voltage was 20 volts to the controller and from the controller to the battery was 12.50. Having the controller closer to the battery did not make a difference in my reading. The tech from Morningstar I emailed said to check the voltage from the controller to the battery at each end and if the voltage was lower to shorten my wire and if it was the same leave it.

12.50V at the battery sounds to me like maybe it wasn't charging, which would explain having the same voltage at each end of the battery line (no current flow).

Was the controller's Charge Status light on? (very briefly blinking off about every 5 seconds)? Or was it opposite? (off, but blinking on briefly every 5 sec)? What was the battery status indication? (steady green I imagine?)
 

dwh

Tail-End Charlie
Um.... Hello? We were talking about what happens during ABSORB mode. There was not one mention at all of what stage the controller was operating in.
Obviously it was not in Absorb mode, it was in Bulk (operating as one loop)

PWMs always operate as one loop. That's all they *can* do. The only difference between bulk and absorb is what voltage they cut out at.
 
