Agree to disagree.
If the battery is only getting 13.8v from the voltage drop, then no more amps are flowing since the battery is also at 13.8v. So the volts are at a stalemate.
The battery isn't "getting" volts. It's "setting" volts. It's getting a flow of electrons through it, which builds up charge in the chemistry, and the voltage of the battery rises. The voltage on the loop is whatever the battery allows, which is the battery's own voltage. The battery is regulating the voltage on the charging loop, and as long as that is below what the charger is set to, the charger will keep charging.
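To make that concrete, here's a toy sketch of that loop. The charger setpoint, current limit, and wire resistance are all made-up numbers, and the battery chemistry is reduced to a single fudge factor - it's only meant to show the loop voltage tracking the battery while the amps taper:

```python
# Toy sketch of a constant-voltage charger and a battery on one loop.
# All numbers are made up; real charge curves and chemistry are messier.

SETPOINT = 14.8    # volts the charger is regulated to
LIMIT = 15.0       # max amps the charger can source (assumed)
LOOP_R = 0.05      # ohms of wire resistance in the loop (assumed)

battery_v = 12.5   # a partially discharged battery sets the loop voltage
step = 0
while True:
    # The charger pushes amps until the loop reaches its setpoint, but
    # the current it can drive is capped by its limit during bulk.
    amps = min(LIMIT, (SETPOINT - battery_v) / LOOP_R)
    if amps < 0.1:               # loop is at the setpoint; charging done
        break
    battery_v += amps * 0.01     # crude stand-in for charge accumulating
    step += 1
    if step % 5 == 0:
        print(f"step {step}: loop at {battery_v:.2f}v, {amps:.1f}a flowing")
```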
So it would need a higher voltage (more pressure) in order to push more amps.
No, it needs a higher voltage *potential* - and it does have that. But the voltage potential of the charger is not the same as the actual voltage of the charging loop.
This is why if you read the voltage across the terminals of the charger, you'll see 14.8v on your meter, even though the voltage on the charging loop is still being held down to 13.8v by the battery. Eventually the charging loop voltage will rise to the limit the charger is regulated to, and from then on the charger will hold the loop at 14.8v.
If you take a solar panel with a Vmp of say 18v, and hook it up directly, with no charge controller, to a battery sitting at say 12.5v - what is the voltage of the charging loop? 12.5v. The *optimum* operating voltage of the solar panel, to get "max power", is 18v, but it's actually only operating at 12.5v, which is far below its optimum. (Oh, the battery still charges even though the entire loop is at 12.5v - because the voltage *potential* of the solar panel is higher than the battery's.) That's why MPPT gets more watts - it decouples the solar panel from the battery, creating two separate loops, and then the MPPT can regulate the voltage on the solar loop at whatever voltage gets the max power from the panel.
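To put rough numbers on that, here's a back-of-the-envelope sketch. It leans on the idealized assumption that below Vmp the panel behaves like a current source at Imp, and the panel specs and MPPT efficiency are hypothetical:

```python
# Rough comparison of direct-connect vs MPPT harvest, assuming the panel
# acts like a current source at Imp anywhere below Vmp. Hypothetical panel.

VMP, IMP = 18.0, 5.5   # hypothetical ~100w-class panel specs
BATTERY_V = 12.5

# Direct connection: the panel gets dragged down to battery voltage.
direct_watts = IMP * BATTERY_V            # 5.5a x 12.5v = 68.75w

# MPPT: panel loop held at Vmp, output converted down to the battery loop.
MPPT_EFFICIENCY = 0.95                    # assumed conversion loss
mppt_watts = VMP * IMP * MPPT_EFFICIENCY  # 18v x 5.5a x 0.95 = 94.05w

print(f"direct: {direct_watts:.0f}w, mppt: {mppt_watts:.0f}w")
```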
So you're saying there is no such thing as voltage drop? If that were the case we could use any wire size we like... but it's not.
That's exactly what I'm saying. Voltage drop - in this context - is theoretical. Wire size, distance and *amp load* through the wire will cause a certain effect. That effect is described as "voltage drop". But that effect is a sliding scale. It's not the same as the "voltage drop" through a diode, which is fixed.
At higher amps, there may be a high "theoretical" voltage drop - but it doesn't matter in terms of getting the battery to full voltage, because the battery is holding down the voltage on the charging loop to far below the charger's potential anyway, and as the battery voltage rises and the amps flowing go down, the voltage drop eventually goes away.
Voltage drop in a charging loop won't cause the battery to end up at a lower ultimate voltage. What it will do is make it take longer to get there.
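Under the hood it's just Ohm's law: drop = amps x wire resistance. A quick sketch, assuming roughly the resistance of a 20' loop of #14 copper, shows the drop tapering away with the amps:

```python
# Voltage drop is just Ohm's law: drop = amps * wire_resistance.
# As the battery fills and the amps taper, the drop tapers with them.

WIRE_RESISTANCE = 0.0505   # ohms, roughly a 20' loop of #14 copper

for amps in (15, 10, 5, 1, 0.5):
    drop = amps * WIRE_RESISTANCE
    print(f"{amps:>4}a -> {drop:.3f}v of drop")
```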
If the voltage drops because a wire has too much resistance, then the amps go up, creating more friction heating up the lines. That's why higher voltage is more efficient, since the amps can drop.
Maybe you need to go back and read HandyBob's charging page. He breaks it all down much better than I can.
https://handybobsolar.wordpress.com/the-rv-battery-charging-puzzle-2/
Ouch.
But actually, HandyBob says the same thing:
"The best chargers can do a reasonable guess at state of charge by providing constant voltage and watching the amps taper as the battery fills to tell them when the battery is full."
"You will find that voltage drop is *directly proportional* to the number of amps (higher amps equals more voltage drop)" [emphasis added - dwh]
But HandyBob does blow it in one way when he talks about using smaller wire between the solar and the charge controller, and bigger wire from the charge controller to the battery. He's using that same false idea I mentioned before and not recognizing that the whole bloody thing is just one big loop.
Well, with an MPPT controller it's actually two loops, but there isn't enough difference in amps between input and output to bother with using bigger wire on one side.
That same false idea is where the common wisdom comes from of putting the charge controller as close as possible to the battery. It doesn't matter. The charge controller is just a switch that either connects the solar to the battery or disconnects it from the battery. Doesn't matter where in the loop you put that switch.
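A trivial sketch of why placement doesn't matter: series resistances add in any order, so the position of the switch can't change the loop total. The segment resistances here are made up:

```python
# A (PWM) controller is just a switch in a series loop, and series
# resistances add in any order - so where it sits can't change the
# total drop. Hypothetical resistances for the two runs of a loop:

run_a = 0.030   # ohms, panel side of the switch (assumed)
run_b = 0.020   # ohms, battery side of the switch (assumed)
amps = 15

# Switch near the panel vs near the battery: same series sum either way.
print(f"controller near panel:   {amps * (run_a + run_b):.2f}v drop")
print(f"controller near battery: {amps * (run_b + run_a):.2f}v drop")
```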
Go to your favorite voltage drop calculator, and put in some wire size and distance, and then keep lowering the amp number. Watch as the voltage drop goes away. Here's the one at Calculator.net. I've already put in the relevant numbers, with the amps lowered to 1 - 14.8v, 1a, #14 wire, 20' loop:
http://www.calculator.net/voltage-d...ance=10&distanceunit=feet&eres=1&x=90&y=10
The result is:
Voltage drop: 0.051
Voltage drop percentage: 0.34%
Voltage at the end: 14.749
Now change the voltage to 10.5v, which is about where a fully dead battery would sit, and the result is:
Voltage drop: 0.051
Voltage drop percentage: 0.49%
Voltage at the end: 10.449
So even using only #14 wire, once the current has tapered to 1 amp you're looking at about a third of a percent drop at 14.8v, and about half a percent against a dead battery's 10.5v. (At the full 15 amps the same wire would drop about 0.76v - call it 5% - but that happens during bulk charging, when the battery is holding the loop voltage down anyway.) Which means that by the time the battery reaches full, there won't be enough voltage drop to matter. It doesn't matter what the drop would theoretically be at full load - because the loop voltage is being regulated by the battery and the charger is going to keep charging until it sees 14.8v on the loop.
Now change the wire size to #6. At 14.8v:
Voltage drop: 0.0079
Voltage drop percentage: 0.053%
Voltage at the end: 14.7921
At 10.5v:
Voltage drop: 0.0079
Voltage drop percentage: 0.075%
Voltage at the end: 10.4921
Sure, there's a lot less voltage drop with #6.
But there wasn't enough to matter even with #14. A third of a percent?
That is not "hugely" important.
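And if you'd rather check the math than trust the calculator, the same numbers fall out of Ohm's law and the standard copper wire tables (give or take the calculator's rounding):

```python
# Sanity check of the calculator numbers above: plain Ohm's law with
# published ohms-per-1000-feet figures for copper wire.

OHMS_PER_1000FT = {"#14": 2.525, "#6": 0.3951}  # copper, from AWG tables
LOOP_FEET = 20    # 10' run, out and back
AMPS = 1.0        # about where the current ends up as the battery fills

for gauge, ohms_per_kft in OHMS_PER_1000FT.items():
    resistance = ohms_per_kft * LOOP_FEET / 1000
    drop = AMPS * resistance
    for source_v in (14.8, 10.5):
        pct = 100 * drop / source_v
        print(f"{gauge} at {source_v}v: {drop:.4f}v drop ({pct:.3f}%)")
```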