Hi! I have a problem with a voltage measurement.
I'm measuring the output voltage of the first amplifier (circuit attached) with a multimeter and with an Arduino voltmeter (code below). I expect 0.00 V, which I think is right, since it is an inverting output and I'm feeding a positive input voltage to the amplifier.
With the multimeter I read 0.0 V; with the Arduino voltmeter, though, I read 0.01 V.
This small difference matters a lot, because at the end of the circuit I want to observe millivolt-level variations.
Could you help me with that?
Thank you!
To observe millivolts you would need to amplify the voltage or, better, use an external 16-bit ADC.
Or you can use a reduced external reference voltage.
With the standard (default) reference voltage, the resolution of the Arduino's onboard ADC is 10 bits = 2^10 = 1024 steps (readings 0-1023).
Bear in mind that any digital readout only resolves to ± the last digit.
If it reads “0.00”, the true value could be anywhere between −0.01 and 0.01.
Accuracy and repeatability are further issues.
In your code, using float may give you the impression of higher resolution, but it doesn't, and you may get misleading results: you still have only 1024 steps (0-1023) in the A/D output.