My understanding is that PWM quickly chops the voltage so that downstream it has effectively created a lowered voltage (usually in the 13.7-14.4 V range, depending on charge stage and controller), right?
It's a circuit, or in other words, a loop. So "downstream", the voltage of the loop is going to be whatever the battery voltage is. Which is what the voltage of the "upstream" side will be as well, since with a non-MPPT controller, the upstream and downstream are just one big loop whenever the switch is turned on.
The quick on-off switching of the PWM can limit the voltage of the circuit by throttling the amount of current flowing through it, which is how it caps the maximum voltage the circuit rises to.
It can also do a very precise job of holding the circuit at a particular voltage, by switching on and giving a little jolt whenever the voltage of the circuit drops a bit.
But the voltage of the circuit is whatever it is. I.e., whatever the battery voltage is. The charger might *potentially* put out 14.4v, but it won't *actually* put out 14.4v until the circuit (and the battery) gets there. So, until the battery gets to that voltage, there is nothing for the PWM to limit, except the current flow. Once the circuit rises to 14.4v, then the PWM can limit the voltage from going any higher - by limiting the current flow.
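To make that concrete, here's a rough Python sketch of that behavior. The battery model is deliberately crude (open-circuit voltage plus an internal-resistance term) and every number in it is invented for illustration, not taken from any particular controller:

```python
# Rough sketch of PWM absorption behavior. Crude battery model:
# loop voltage = open-circuit voltage + current * internal resistance.
# All numbers are invented for illustration.

SETPOINT = 14.4       # absorption setpoint, volts (assumed)
PV_CURRENT = 10.0     # current the panel supplies while the switch is closed, amps (assumed)
R_INTERNAL = 0.05     # battery internal resistance, ohms (assumed)

ocv = 13.5            # starting open-circuit voltage, volts (assumed)
duty = 1.0            # fraction of each PWM period the switch is closed

for step in range(20):
    avg_current = duty * PV_CURRENT
    loop_v = ocv + avg_current * R_INTERNAL   # one loop: same voltage at controller and battery
    if loop_v > SETPOINT:
        duty = max(0.0, duty - 0.1)           # at the setpoint: narrow the pulses
    else:
        duty = min(1.0, duty + 0.1)           # below the setpoint: nothing to limit, stay on
    ocv += 0.005 * avg_current                # crude stand-in for the battery slowly filling
    print(f"step {step:2d}  duty {duty:.1f}  avg current {avg_current:4.1f} A  loop voltage {loop_v:5.2f} V")
```

Notice the duty cycle sits at 100% (nothing to limit) until the loop voltage reaches the setpoint, and only then does the average current start getting throttled.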
"When a battery voltage reaches the regulation
setpoint, the PWM algorithm slowly reduces the
charging
current to avoid heating and gassing of
the battery, yet the charging continues to return
the maximum amount of energy to the battery
in the shortest time." [my emphasis added]
http://www.morningstarcorp.com/en/support/library/8. Why PWM1.pdf [Page 1]
And look at Section 7 (Page 5):
"7.
Self-regulate for voltage drops
and temperature effects
With PWM
constant voltage charging, the critical
finishing charge will taper per the equation I = Ae-t.
This provides a self-regulating final charge that
follows the general shape of this equation.
As such,
external system factors such as voltage
drops in the system wires will not distort the
critical final charging stage.
The voltage drop
with tapered charging current will be small
fractions of a volt. In contrast, an on-off
regulator will turn on full current with the full
voltage drop throughout the recharging cycle
(one reason for the very poor charge efficiency
common to on-off regulators)." [my emphasis added]
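If you want to see what that I = Ae^(-t) taper looks like in numbers, here's a quick sketch - the starting current A and the time scale are invented for illustration, not anything from the Morningstar paper:

```python
import math

# The taper shape the Morningstar note describes: I = A * e^(-t).
A = 10.0                          # amps at the start of the finishing charge (assumed)

for t in [0, 0.5, 1, 2, 3, 4]:    # time in (assumed) hours
    current = A * math.exp(-t)
    print(f"t = {t:>3} h   I = {current:5.2f} A")
```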
Thus if you have a 5% voltage loss after the controller, it will only be 95% of the controlled voltage at the battery.
Well...no. It's a loop. It's not like "here's one end of the thing and here's the other end" - it's all one loop. If you have a 5% voltage drop, then it'll be a 5% drop over the whole loop (circuit).
And that voltage drop will go away as the battery voltage rises and the PWM throttles back the current to the battery, because the voltage drop scales with the current flow (V = I × R).
If it is slower switching and it's based on what the battery voltage rises to, then you're still limited by what the voltage at the controller is (unless there is an auxiliary battery-sensing line), since the controller has no idea how much voltage drop there has been, right?
Yes, correct. But it doesn't matter because the voltage at the controller is going to be the same as the battery, since the battery is setting (limiting) the voltage of the entire circuit. It's not like the controller end is 5% higher than the battery end; there is no "end" - it's a loop.
Now, let's say there's a 5% voltage drop. Well, it doesn't matter: the entire loop will be 5% low until the battery reaches the point where the current flowing through the loop reduces. And as the current reduces, the voltage drop reduces with it. By the time the current gets down to 1 A, the voltage drop will be down to something like a tenth (or maybe a hundredth) of a percent.
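Here's the rough arithmetic on that, as a sketch. The current at which the 5% drop was originally measured is an assumption (20 A here), so the exact percentage you land on at 1 A will vary, but the linear scaling is the point:

```python
# Assume (hypothetically) the 5% drop (~0.72 V out of 14.4 V) was measured at
# 20 A of charging current. The wire resistance is then fixed, and the drop
# scales linearly with current (V_drop = I * R).

FULL_CURRENT = 20.0                     # amps at which the 5% drop was seen (assumed)
drop_at_full = 0.05 * 14.4              # ~0.72 V
r_wire = drop_at_full / FULL_CURRENT    # ~0.036 ohm

for amps in [20.0, 10.0, 5.0, 1.0]:
    drop = amps * r_wire
    print(f"{amps:4.1f} A  ->  {drop:.3f} V drop  ({drop / 14.4 * 100:.2f}% of 14.4 V)")
```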
Since it is being chopped down, any extra voltage that makes it to the controller is just going to get wasted in the PWM chopping, so you might as well take the loss on that side and let the controller's set voltage be as accurate as possible at the battery.
Ahh...I think I just found the stumbling block.
There is no "extra voltage" (or extra current, for that matter). The PV will make power only as long as there is somewhere for it to go. If the PWM were to cut, say, 50% of the power going to the battery, then the PV would be making 50% less power. The entire loop (PV, controller, battery) would have 50% less current flowing through it.
So, it's not like there is more power on one side of the charge controller than the other - as long as the switch is on, it's just one big loop. And when the switch is off, the whole thing is dead because no power can flow from the battery to the PV due to backfeed diodes blocking it.
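A tiny sanity check on the "one big loop" point, with made-up numbers - the watts are the same on both sides of the switch, and chopping the duty cycle just means the PV produces less:

```python
# With the switch closed only part of the time, the average current everywhere
# in the loop drops, and so does the power the PV is actually producing.

battery_v = 13.8      # loop voltage while the switch is closed, volts (assumed)
pv_current = 8.0      # current the panel pushes at that voltage, amps (assumed)

for duty in [1.0, 0.5, 0.1]:
    avg_current = duty * pv_current
    power = battery_v * avg_current        # same watts on both sides of the switch
    print(f"duty {duty:.1f}: {avg_current:4.1f} A average, {power:5.1f} W from the PV")
```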
Now, an MPPT charge controller is a little different - on the upstream side. It finds the maximum power point, putting just the right load on the PV to get the most watts out of it. The voltage on the upstream circuit ends up being whatever it has to be to get the most watts. Then it down-converts that voltage to whatever it has to be to maximize the amperage into the battery, and feeds that through a PWM stage to the downstream side, where it acts like any other PWM charge controller.
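Here's a very idealized sketch of that upstream behavior. The panel's I-V curve is a toy model and conversion losses are ignored, so treat every number as illustrative only:

```python
# Sweep a toy panel I-V curve for the maximum power point, then (ignoring
# losses) deliver those watts at battery voltage. Compare with a PWM
# controller, which can only take whatever current the panel pushes at
# battery voltage.

def panel_current(v):
    """Toy I-V curve: near-constant current that collapses toward open-circuit voltage."""
    i_sc, v_oc = 8.0, 21.0                # short-circuit current and open-circuit voltage (assumed)
    if v >= v_oc:
        return 0.0
    return i_sc * (1 - (v / v_oc) ** 8)

best_v, best_p = 0.0, 0.0
for step in range(211):                   # 0.0 V to 21.0 V in 0.1 V steps
    v = step * 0.1
    p = v * panel_current(v)
    if p > best_p:
        best_v, best_p = v, p

battery_v = 13.8                          # downstream loop voltage (assumed)
print(f"max power point: {best_v:.1f} V, {best_p:.1f} W")
print(f"ideal MPPT into the battery: {best_p / battery_v:.1f} A at {battery_v} V")
print(f"PWM (direct connection):     {panel_current(battery_v):.1f} A at {battery_v} V")
```

The last two lines are the whole difference: the PWM controller only gets whatever current the panel happens to push at battery voltage, while the MPPT stage harvests the panel at its best voltage and trades the excess voltage for extra amps.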