Accurate voltage measurement

Hi,

I’ve found myself in possession of some 50 Ah / 100 volt LiPo battery packs.

Whilst I consider all the exciting ways of injuring myself and setting fire to my workshop, it occurred to me that it would be useful to be able to accurately and consistently measure the battery voltage. Many of my projects include a simple voltage divider circuit connected to an analogue input pin to log voltage, but my guess is that this is unlikely to be as consistent or accurate as I’d like.
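For illustration, this is the sort of thing I've been doing (a minimal sketch - the 330k/10k divider values and pin choice are just examples, sized so that ~100 V at the battery puts roughly 3 V on the pin):

const int BATT_PIN = A0;        // divider mid-point
const float R_TOP = 330000.0;   // example top resistor (battery side)
const float R_BOT = 10000.0;    // example bottom resistor (ground side)
const float VREF  = 5.0;        // assumed ADC reference voltage

void setup() {
  Serial.begin(9600);
}

void loop() {
  int raw = analogRead(BATT_PIN);                 // 0..1023
  float vPin  = raw * VREF / 1023.0;              // voltage at the pin
  float vBatt = vPin * (R_TOP + R_BOT) / R_BOT;   // undo the divider
  Serial.println(vBatt, 1);                       // one decimal place
  delay(1000);
}

Every term in that last scaling - the reference, the ADC, and both resistors - contributes its own error, which is what prompts the question.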

What’s the best way of accurately measuring voltage?

Thanks

I would just stick one of the cheapo multimeters on it, nice big display and in its own case.

An alternative would be one of these - http://www.ebay.co.uk/itm/DC-3-Wire-LED-Digital-Display-Panel-Volt-Meter-Voltage-Voltmeter-Car-Motor-BF-/252414547360?var=&hash=item3ac51455a0:m:mye17Bb78eFustYpRZKZXyg

I would just stick one of the cheapo multimeters on it, nice big display and in its own case.

Sorry if I didn't make myself clear - I'm looking for a way to accurately measure it using an Arduino, so that when I do come up with an interesting project I can monitor its voltage in real time.

I already have a mid-range multimeter, but have no way of knowing if it's accurate.

Why would it be inconsistent or inaccurate?

I would expect the resistors in the voltage-divider circuit to change slightly with time and temperature - this would affect consistency. Similarly, I can only accurately calculate voltage with this method if I know (accurately) the exact resistance of the divider circuit.

How much accuracy do you need? 1% resistors are common and 0.1% resistors are available.

I would expect the resistors in the voltage-divider circuit to change slightly with time and temperature - this would affect consistency. Similarly, I can only accurately calculate voltage with this method if I know (accurately) the exact resistance of the divider circuit.

Resistors don't change significantly over time or with "normal" temperature variations.

The Arduino voltage reference is likely to be the biggest source of drift.
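To put a rough number on that (a back-of-envelope sketch, using illustrative 330k/10k values): with 1% parts the divider ratio is off by at most about 2% worst-case, and that is a fixed gain error you can calibrate out once:

void setup() {
  Serial.begin(9600);
  const float rTop = 330000.0, rBot = 10000.0, tol = 0.01;  // 1% parts
  float nominal = rBot / (rTop + rBot);
  // worst case: top resistor at +1% while the bottom one is at -1%
  float worst = (rBot * (1.0 - tol)) / (rTop * (1.0 + tol) + rBot * (1.0 - tol));
  Serial.print("worst-case ratio error, percent: ");
  Serial.println(100.0 * (worst / nominal - 1.0), 2);  // prints about -1.92
}

void loop() {}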

I already have a mid-range multimeter, but have no way of knowing if it's accurate.

The accuracy should be stated in the specs. Your homemade meter will probably be less accurate, and of course you'll have to use your multimeter to check the calibration of your homemade meter - you can't get any more accurate than your "calibration standard".

Meters used in lab or production environments are periodically calibrated by an independent lab with traceable calibration standards, and they are calibrated to within the manufacturer's specifications. Of course, most hobbyists' meters are not calibrated, and a calibration lab wouldn't want to calibrate your homemade meter.

Think we are missing something - how are you defining accurate if a mid-range multimeter is not acceptable to you?

The typical way of checking a multimeter is against a 5 or 10 volt reference diode or special reference device - there are many around.

Checking for 100V accuracy is somewhat harder.

The regulator on an Arduino is probably only +/- 5%, and I don't know its temperature coefficient.

Your 0...1023 ADC reading will be scaled by that.

So you'd need to measure this accurately, together with your resistor chain, to obtain an accurate reading.

I think the internal 1.1V reference is better - but again you'd need to measure that.

How accurate do you want to get? - with all the above a cheap 3 1/2 digit meter is likely to be better.
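If it helps, the usual trick for pinning down the actual Vcc in software is to read the internal 1.1V bandgap against it (a sketch only, assuming an ATmega328-based board - the 1100mV constant is nominal, specified to around +/-10%, so trim it once against a trusted meter):

long readVccMillivolts() {
  // select the internal 1.1V bandgap as the ADC input, AVcc as reference
  ADMUX = _BV(REFS0) | _BV(MUX3) | _BV(MUX2) | _BV(MUX1);
  delay(2);                      // let the mux and reference settle
  ADCSRA |= _BV(ADSC);           // start a conversion
  while (ADCSRA & _BV(ADSC)) ;   // wait for it to complete
  long result = ADC;             // 10-bit reading of 1.1V against Vcc
  return 1125300L / result;      // 1100mV * 1023 -> Vcc in millivolts
}

You can then use the returned value instead of an assumed 5.0V when scaling your divider readings.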

regards

Allan

Think we are missing something - how are you defining accurate if a mid-range multimeter is not acceptable to you?

Or to turn the question on its head, what makes you think a mid-range multimeter is accurate?

I've often had this problem when dealing with sensors. I can calibrate a proximity detector using a ruler with a reasonable degree of confidence that it's correct, I can calibrate a thermometer against the boiling / freezing point of water and a pressure sensor against a column of mercury, but how can I really be sure my anemometer/ammeter/voltmeter is working accurately?

OhMyCod: Or to turn the question on its head, what makes you think a mid-range multimeter is accurate?

I've often had this problem when dealing with sensors. I can calibrate a proximity detector using a ruler with a reasonable degree of confidence that it's correct, I can calibrate a thermometer against the boiling / freezing point of water and a pressure sensor against a column of mercury, but how can I really be sure my anemometer/ammeter/voltmeter is working accurately?

There is a forum especially for folk like you, called Metrology (meant nicely) :)

Think you can get all such questions followed up there. Expect some of those members will also be in this forum, but you will probably be able to pursue the details better over there. http://www.eevblog.com/forum/metrology/

Hi,

50 Ah / 100 volt LiPo battery packs.

wow, can you post a picture of a pack please?

If you are worried about voltage accuracy, and it really worries you, then buy a FLUKE meter.

They are calibrated fresh from the factory; put it away and only use it to check the calibration of your other DMMs.

I do instrument calibrations and find the top end DMMs hold their calibration for more than 10 years, even the ones that are bounced around a car boot or thrown around a factory floor.

And if you are really, really worried about cal, then send your DMM to a calibration centre and get it certified. (Personally, buying a high-end DMM is cheaper.)

I suspect you want high accuracy because of the charge-discharge levels of LiPo; you don't need 1/100 of a volt accuracy.

Tom... :)

You should measure cell voltage, not stack voltage. Cell voltage is far more important in a 100 volt LiPo battery. There are special balance charger/monitor chips that can do this. A DIY "flying capacitor" circuit might also do what you want.

Leo..
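For what it's worth, if each tap is brought out through its own divider, differencing adjacent taps gives per-cell voltages - a sketch under those assumptions (4 hypothetical taps on A0-A3, with each divider ratio measured and entered by hand; all values illustrative):

// Hypothetical: 4 pack taps wired to A0..A3 through individual dividers.
const int NUM_TAPS = 4;
const int tapPin[NUM_TAPS] = {A0, A1, A2, A3};
const float tapRatio[NUM_TAPS] = {2.0, 4.0, 6.0, 8.0};  // measured per channel
const float VREF = 5.0;                                 // assumed reference

void setup() {
  Serial.begin(9600);
}

void loop() {
  float below = 0.0;
  for (int i = 0; i < NUM_TAPS; i++) {
    float tapVolts = analogRead(tapPin[i]) * VREF / 1023.0 * tapRatio[i];
    Serial.print("cell ");
    Serial.print(i + 1);
    Serial.print(": ");
    Serial.println(tapVolts - below, 2);  // adjacent-tap difference = one cell
    below = tapVolts;
  }
  delay(1000);
}

The catch is that each cell voltage is then the small difference of two much larger scaled readings, so divider and ADC errors stack up fast towards the top of the pack - which is exactly why those dedicated monitor chips exist.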

Meters used in lab or production environments are periodically calibrated by an independent lab with traceable calibration standards, and they are calibrated to within the manufacturer's specifications.

I know - I used to have a partner who worked in one; this is why I'm so aware of the inaccuracy of most instruments the general public use. Next time you drive your car, look at the difference between how fast your speedometer says you're going and what your GPS says, or try pumping up your tyres on a garage forecourt and then measure the pressure again at the next garage (if you're very bored!) - if they're within 20% of each other I'd be surprised.

wow, can you post a picture of a pack please?

No. But I can post the datasheet! I've acquired 3 of these (all slightly damaged): http://www.front-electric-sustainer.com/Manuals/FES%20BATTERY%20PACK%20GEN1%20manual%20v1.13.pdf

You should measure cell voltage, not stack voltage.

First things first! Yes, I've been doing some reading since getting my new toys, but I'll start by trying to monitor the overall stack voltage.

Hi,
Nice, pity it didn't come with the BMS.

[attached photos: batts1.jpg, batts2.jpg]

Tom... :)