Hopeless Diamond
Explorer
So, this isn't a critical issue, more of a question about what's accurate and how to measure it.
I have a 3rd Gen 4Runner with the stock alternator. I added a GM diode to the exciter circuit to raise the output voltage a little, and I've added a "house" AGM battery through a solenoid, hoping to give it a better chance at a full charge. I run an Android head unit with Torque connected to the OBD2 port and read the system voltage from there (the diode did raise it by ~0.4 to 0.5 V; it now ranges from 14.1 to 14.6 V depending on temperature, loads, etc.).
I also have a Midland dual-band ham radio connected directly to the starting battery. It has a voltage readout, and it has always read about 0.3 to 0.4 V lower than the OBD2 (13.9 to 14.1 V). It seems to stay in a tighter range, but I can't watch it as easily as the OBD2 readings.
I then replaced the cigar lighter with a dual USB power port that has a voltage display. It reads lower than the OBD2 by 0.5 to 0.6 V, and lower than the radio by about 0.1 to 0.2 V.
All this got my curiosity up, so I got out my VOM. It comes back with readings different from all the others, typically in the middle. This is at the battery posts (both batteries read the same when the solenoid is closed, so it seems my wiring is right).
At the end of the day it's not a big deal, and as long as each device reads consistently, that's fine. The OBD2 reading seems high to me, and the power port and radio readings seem low. As I expand the house-side system (nothing on it yet), I do want the ability to check voltages when it's isolated, for fridges, nighttime lighting, device charging, etc., and I figure I'll end up with yet another reading. It's rare that we stay at one location for very long, so in theory that system will recharge during travel, but even then, with short trips and enough loads it may not get topped off completely, and then I could run into problems. Knowing a baseline reading would help. What a display says is less important than what it means, as long as it's consistent (it may read 11.9 V when it's actually at 12.5 V). Short of $$ for a calibrated voltage meter, how do you accurately read a "volt"?
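One practical answer to the baseline question is a one-time "offset calibration": take simultaneous readings from every display and one reference meter (say, the VOM at the battery posts), record each device's offset from the reference, and mentally add that offset from then on. A minimal sketch of the arithmetic, with hypothetical device names and illustrative numbers loosely based on the figures in the post:

```python
def offsets(reference: float, readings: dict[str, float]) -> dict[str, float]:
    """Offset to ADD to each device's display to approximate the reference.

    reference: the trusted reading (e.g. VOM at the battery posts).
    readings:  simultaneous display readings, keyed by device name.
    """
    return {name: round(reference - value, 2) for name, value in readings.items()}


def corrected(display: float, offset: float) -> float:
    """Apply a previously recorded offset to a new display reading."""
    return round(display + offset, 2)


# Simultaneous snapshot with the engine running (illustrative numbers only):
vom = 14.2                     # reference reading at the battery posts
snapshot = {"obd2": 14.5, "radio": 14.0, "usb_port": 13.9}

cal = offsets(vom, snapshot)
# e.g. {'obd2': -0.3, 'radio': 0.2, 'usb_port': 0.3}

# Later, a 11.9 V display on the USB port means roughly:
print(corrected(11.9, cal["usb_port"]))  # → 12.2
```

The offsets won't be perfect (cheap displays often drift with temperature and load), but as the post notes, consistency matters more than absolute accuracy, and a recorded offset per device turns each display back into a usable gauge.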