Let's say you have a 500 amp-hour 12 volt battery and a 1000 amp-hour 12 volt battery on the same vehicle for some reason. Let's further say that they are both essentially dead, but you jump-started your engine and now you're letting the alternator get after it on a long trip. You don't plan on shutting the engine down for at least 4 hours.
My thought process goes like this:
Your alternator doesn't care about amperage. It cares about voltage, because it has a built-in voltage regulator. So it's going to put out 13.8 volts all the time. Now the small battery doesn't have as much capacity as the large battery, so it gets charged in half the time that it takes to charge the large battery. However, once it's charged, it doesn't accept any more charging (because the voltage difference between it and the alternator is 0), and the larger battery only gets charged from that point forward.
I've read enough on NAT to know that people a lot smarter than me say that's not how it works, and I believe them. Can someone go through the math to tell me why, though? If it can be explained without equations, that's fine too, I just want to understand. Thanks.
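For what it's worth, the scenario can be sketched as a toy simulation. This models each battery the standard simple way, as an ideal voltage source (whose open-circuit voltage rises with state of charge) in series with an internal resistance, so the charging current follows Ohm's law from the 13.8 V bus. All the component values (internal resistances, the open-circuit voltage curve) are illustrative assumptions, not measured numbers, and the alternator's real-world current limit is ignored:

```python
# Toy model: two dead lead-acid batteries charging in parallel from a
# constant-voltage alternator. Each battery = ideal EMF in series with
# an internal resistance (simple Thevenin model). All numbers are
# illustrative assumptions; the alternator's current limit is ignored,
# so the early currents come out unrealistically large.

ALT_V = 13.8  # alternator/regulator output voltage (volts)

def emf(soc):
    """Open-circuit voltage rising linearly with state of charge:
    roughly 11.8 V dead to 12.7 V full for a 12 V lead-acid battery."""
    return 11.8 + 0.9 * soc

def simulate(cap_small=500.0, cap_large=1000.0,
             r_small=0.010, r_large=0.008, hours=4.0, dt=1 / 60):
    """Step both batteries forward in time (dt in hours) and record
    the charging current each one draws at every step."""
    soc = {"small": 0.0, "large": 0.0}          # state of charge, 0..1
    cap = {"small": cap_small, "large": cap_large}   # amp-hours
    res = {"small": r_small, "large": r_large}       # ohms
    history = []
    t = 0.0
    while t < hours:
        currents = {}
        for name in soc:
            # Charging current set by the voltage difference across
            # the internal resistance (Ohm's law).
            i = max(0.0, (ALT_V - emf(soc[name])) / res[name])
            # Amp-hours delivered this step raise the state of charge.
            soc[name] = min(1.0, soc[name] + i * dt / cap[name])
            currents[name] = i
        history.append((t, currents["small"], currents["large"]))
        t += dt
    return soc, history

final_soc, history = simulate()
t0, i_small0, i_large0 = history[0]
print(f"initial currents: small={i_small0:.0f} A, large={i_large0:.0f} A")
print(f"state of charge after 4 h: small={final_soc['small']:.2f}, "
      f"large={final_soc['large']:.2f}")
```

The point of the sketch: both batteries draw current from the first instant, with the split set by their internal resistances rather than one battery charging to full before the other starts. Each battery's current then tapers off on its own as its voltage rises toward the alternator's.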