I've been working on a battery charger on and off for a few months now. It's a slow process because every series of tests takes a week or more to be sure I got it right. Plus, I've been plagued by misinformation (well-meaning but mistaken people) all over the web.
But now I want to be able to measure voltage accurately down to at least a tenth of a volt, and I can't figure out how to do it. I borrowed a neighbor's meter that is calibrated by the local power company every few months, and it disagreed with every meter I had, even an old Simpson. Turns out his meter was wrong.
So, what can I use as a voltage standard to calibrate a cheap meter for this kind of thing? I'm willing to tweak a meter to get a good reading, but I can't figure out a good source to start with. Maybe a precision resistor: put a known current through it and derive a voltage from that?
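For what it's worth, the catch with the resistor idea is that the current source's tolerance stacks on top of the resistor's. Here's a quick sketch of that worst-case math with made-up example numbers (a 10 mA source at 0.5% and a 1k 0.1% resistor are assumptions for illustration, not a recommendation):

```python
def worst_case_voltage_error(i_amps, i_tol, r_ohms, r_tol):
    """Worst-case error of V = I*R when both I and R carry a tolerance.

    Tolerances are fractional (0.001 = 0.1%).  Returns the nominal
    voltage and the largest deviation when both errors push the
    same direction.
    """
    v_nominal = i_amps * r_ohms
    v_max = i_amps * (1 + i_tol) * r_ohms * (1 + r_tol)
    return v_nominal, v_max - v_nominal

# Hypothetical numbers: 10 mA current source good to 0.5%
# through a 1k 0.1% precision resistor.
v, err = worst_case_voltage_error(0.010, 0.005, 1000.0, 0.001)
# Nominal 10 V, but the worst-case error is around 60 mV -- the
# current source dominates, which is why the resistor alone
# doesn't buy you much unless the current is known very precisely.
```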
Things like 13.5V to float an AGM battery vs. 13.9V for a flooded lead-acid battery vs. 13.6V for a sealed lead-acid.
I'm trying to overcome sulfation in batteries stored for a long time by keeping them on a float charger without overcharging them and boiling away the water. That's a fine line most commercial chargers don't walk very well. I plan to hook one up with slightly too much charge, measure the water loss over time, and minimize it by lowering the float voltage until I reach a happy medium. Then I'll seriously look into a spiked pulse to remove whatever residual sulfation I can. I'm really tired of replacing batteries in little-used devices that keep their batteries in hard-to-reach places. I'm talking about more than ten of these things that get used a few times a year and always seem to have battery trouble when I go to use them.
Oh, and what do other folks do for medium-term data logging? I want to record voltage and current changes in these things over a week under various conditions, and I don't want to spend a fortune on equipment. I only have one laptop, and tying it up doing something like that sucks.
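One cheap approach, sketched below under the assumption that something (a microcontroller on a serial port, a USB DVM, whatever) can hand you one voltage/current pair on demand: just append timestamped rows to a CSV file. The `read_sample` hook is a placeholder for your actual hardware read.

```python
import csv
import time

def log_readings(read_sample, path, n_samples, interval_s=0.0):
    """Append timestamped (volts, amps) samples to a CSV file.

    read_sample is whatever function gets one reading from your
    hardware; that part depends entirely on the setup.  Appending
    (mode "a") means a crash or reboot only loses one sample.
    """
    with open(path, "a", newline="") as f:
        writer = csv.writer(f)
        for _ in range(n_samples):
            volts, amps = read_sample()
            writer.writerow([time.time(), volts, amps])
            if interval_s:
                time.sleep(interval_s)
```

At one sample every ten minutes, a week is only about a thousand rows, so the file stays tiny and the laptop is barely loaded while it runs.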
Of course, most won't be as lucky as to have picked one up very cheap at a surplus outlet many years ago.
There are many voltage reference ICs available that output a fixed voltage (2.5 VDC is a common value), are spec'd to a guaranteed accuracy, and only cost a few dollars. I've found that many DVMs are pretty accurate, but it's always good to have your own 'reference standard' on hand to compare unknown meters against. 0.1 VDC accuracy is not difficult to obtain with most modern DVMs.
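The comparison itself is simple enough to write down. One detail worth keeping straight: the reference has its own tolerance, so a meter only clearly fails when the disagreement exceeds both error bands combined. A little sketch of that check (the specific tolerances below are just example numbers):

```python
def meter_agrees(meter_reading, v_ref, ref_tol, meter_tol):
    """True if a meter reading is consistent with a reference.

    The reference itself is only guaranteed to +/- ref_tol, so a
    meter only clearly fails when the disagreement exceeds both
    error bands combined.  Tolerances are fractional (0.001 = 0.1%).
    """
    allowed = v_ref * (ref_tol + meter_tol)
    return abs(meter_reading - v_ref) <= allowed

# A 2.5 V reference good to 0.1%, against a meter claiming 0.5%:
meter_agrees(2.513, 2.5, 0.001, 0.005)  # True: 13 mV off, band is 15 mV
meter_agrees(2.540, 2.5, 0.001, 0.005)  # False: 40 mV is well outside
```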
Buy a good-quality DVM and use it. I used to work as a biomed tech, and we had a guy come by twice a year to check and calibrate our test equipment. Our Fluke meters never needed any adjustment and always passed. That's been my experience. As long as you don't try to use one with a weak battery, they perform great.
The meter I borrowed was a Fluke. I don't remember the model, but it was off by almost a volt. He's a power company repairman who uses it every day, and it gets calibrated every few months by the company's meter division. Which makes me wonder about the power meter on the side of the house!
I checked out the voltage reference devices that Lefty suggested, and they are pretty compelling. National has a number of them, and I'm going to order a 3V device so I can feed it from a 5V supply. I could come up with a pretty good reference voltage using a 5V wall wart and one of these things to check a particular range on a meter. They have an accuracy of 0.1%, which should get me within 3 millivolts of where I want to be.
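For anyone following along, the 3 mV figure is just the tolerance times the reference voltage, which is easy to sanity-check:

```python
v_ref = 3.0   # the 3 V reference device
tol = 0.001   # 0.1% initial accuracy
band_v = v_ref * tol
# band_v is about 0.003 V: the reference pins the true value to
# roughly +/- 3 mV, far inside the 0.1 V resolution the charger
# work actually needs.
```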