Make SURE, SURE, SURE you use the fattest electrical wiring you can get away with. Low-voltage DC sucks power on long wire runs: the current flowing through the wire's resistance eats up voltage, and at automotive voltages that loss costs you many times more than the same run would on AC wiring.
Not to be THAT guy but here's how that works.
Ohm's law is E = I × R, where E is voltage, I is current, and R is resistance.
You can solve for any one of the three by rearranging the equation.
Voltage divided by the resistance of the load will tell you how much current you are drawing (I = E/R).
And so on:
Voltage drop divided by current draw will tell you the resistance of whatever the current is passing through, in our case the wire run (R = E/I).
Current drawn times resistance will tell you how much voltage drop you have (E = I × R).
Passing current through a resistor (in this case, thin wire) will create a drop in voltage; it has no choice, it has to.
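If it helps to see those three rearrangements in one place, here's a quick Python sketch (the 12v supply and 6 ohm load are made-up numbers, purely for illustration):

```python
# Ohm's law, E = I * R, solved all three ways.
# The voltage and load resistance below are arbitrary example numbers.

E = 12.0          # volts across the load
R_load = 6.0      # ohms, resistance of the load

I = E / R_load    # current drawn: I = E/R -> 2.0 amps
R = E / I         # resistance backed out: R = E/I -> 6.0 ohms
E_drop = I * R    # voltage across it: E = I*R -> 12.0 volts

print(I, R, E_drop)   # 2.0 6.0 12.0
```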
We'll use an example to make the principle clear:
Given: a 10 amp load (charging or drawing)
Given: a wire run of 20 feet
Variable: Wire gauge
Note: I didn't talk about voltage or AC vs DC, and I don't have to. Current and resistance are the only factors in determining voltage drop.
Using this chart http://www.seas.gwu.edu/~ecelabs/appnotes/PDF/techdat/swc.pdf we know the resistances of various wire gauges per 1000ft. Divide that by 50 (1000ft ÷ 20ft = 50) and we have the resistance of our 20ft run.
For 20awg wire the average resistance is approximately 10 ohms/1000ft, so our 20ft run is 0.2 ohms.
0.2 ohms times our 10 amp load is 2 volts. That is the voltage drop of the wire.
If your battery/alternator supplies 13.6v and you put a 10 amp load at the end of a 20 foot run of 20awg wire, it will drop 2V in the wire: you will see 11.6v at the load, and 20 watts of power (10 amps × 2 volts dropped) will be dissipated as heat by the wire itself. That is a 15% loss!
If your wall outlet supplies 125v and you put a 10 amp load at the end of a 20 foot run of 20awg wire, it will still drop 2V in the wire: you will see 123v at the load and 20 watts will STILL be dissipated as heat by the wire itself, but that is only a 1.6% loss.
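If you want to play with the numbers yourself, here's that comparison as a small Python sketch (same rounded 10 ohms/1000ft figure for 20awg as above):

```python
# Same 10 amp load and same 20 ft of 20awg wire, two different supplies.
R_wire = 10.0 / 1000 * 20      # ~10 ohms/1000ft -> 0.2 ohms for 20 ft
I = 10.0                       # amps drawn by the load

drop = I * R_wire              # 2 V lost in the wire either way
heat = I**2 * R_wire           # 20 W of heat in the wire either way

for supply in (13.6, 125.0):
    pct = drop / supply * 100
    print(f"{supply:g}V supply: {supply - drop:.1f}V at the load, "
          f"{heat:.0f}W in the wire, {pct:.1f}% loss")

# 13.6V supply: 11.6V at the load, 20W in the wire, 14.7% loss
# 125V supply: 123.0V at the load, 20W in the wire, 1.6% loss
```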
You could also infer from that example that with 30awg, at approximately 100 ohms/1000ft, our 20ft run is 2 ohms and the voltage drop at 10 amps would be 20v. Meaning that long before the 10 amp load got powered, all your 13.6 volts would be dissipated as heat, approximately 200 watts worth... With 125v it would drop you to 105 volts and it might still work, but remember you are still heating that wire. I'm pretty sure the magic smoke would get out at that point.
But we are interested in going the other way. Moving to 10awg drops the resistance to approximately 1 ohm/1000ft, so our 20ft run is 0.02 ohms and the voltage drop at 10 amps would be 0.2v. Meaning that from a 13.6v supply your voltage at the load would be 13.4v and the wire would be dissipating approximately 2 watts as heat... That's a good setup.
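Here are all three gauges side by side, as a rough sketch using the same rounded chart values:

```python
# Voltage drop and wire heating for a 10 A load over 20 ft, three gauges.
# Resistances are the rounded ohms/1000ft values used above.
ohms_per_1000ft = {"30awg": 100.0, "20awg": 10.0, "10awg": 1.0}
I, run_ft, supply = 10.0, 20.0, 13.6

for gauge, r1000 in ohms_per_1000ft.items():
    R = r1000 / 1000 * run_ft   # resistance of the 20 ft run
    drop = I * R                # volts lost in the wire
    heat = I**2 * R             # watts the wire turns into heat
    print(f"{gauge}: {R:g} ohms, {drop:g}V drop, {heat:g}W of heat, "
          f"{supply - drop:g}V left at the load")

# 30awg: 2 ohms, 20V drop, 200W of heat, -6.4V left at the load
#        (negative: a 13.6V supply can't even push 10 A through that wire)
# 20awg: 0.2 ohms, 2V drop, 20W of heat, 11.6V left at the load
# 10awg: 0.02 ohms, 0.2V drop, 2W of heat, 13.4V left at the load
```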
That's the rub: voltage loss as a percentage of supplied voltage, with a side order of "how much heat do I want the wire making?"
This is the same reason 48V DC is popular for inverters and solar setups. Higher voltage and lower current for the same wattage means less power wasted pumping juice and more power getting to the destination. There is also the factor of 48VDC being easier/more efficient to convert to 120VAC, but that is for another discussion.
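As a rough sketch of why (the 1000 watt load and the 20 feet of 10awg are made-up numbers for illustration): quadruple the voltage and you quarter the current, and since wire heat goes as I², the loss in the same wire falls 16-fold.

```python
# Same 1000 W load over the same 0.02 ohm run (20 ft of 10awg),
# fed from a 12 V system vs a 48 V system.
R_wire = 0.02
watts = 1000.0

for volts in (12.0, 48.0):
    I = watts / volts          # current needed to deliver the same power
    heat = I**2 * R_wire       # power the wire wastes as heat
    print(f"{volts:g}V system: {I:.1f}A, {heat:.1f}W lost in the wire")

# 12V system: 83.3A, 138.9W lost in the wire
# 48V system: 20.8A, 8.7W lost in the wire
```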
The same rules govern the high-tension power lines used for electrical transmission. The power company uses transformers to step up the generated voltage (to 115 kV through 765 kV AC, varying by transmission system and country) for transmission over long distances. Higher voltage = less current for the same wattage = less loss over the same wire; step the voltage up 10× and the current drops 10×, so the I²R loss in the line drops 100×.
Unfortunately, in a vehicle application your voltage is your voltage; whatever you do, you have to size the wire to your application's needs.