I am hoping to set up a long-term datalogger using a precision thermistor, and that got me digging into issues around calibrating the gain & offset of the Arduino ADC.
I'd also like to read the thermistor/resistor divider against the internal 1.1 V reference to pick up another couple of effective bits, so I've been experimenting with the capacitor method of directly measuring the internal 1.1 V bandgap voltage described at:
and with another method that derives the bandgap by matching your measured Vcc to a calculated one, described over at Open Energy Monitor:
The problem I've been seeing is that the two methods give me different numbers:
say 1098 mV from the capacitor method and 1078 mV from the CalVref.ino code.
When I work backwards from the supply voltage measured with a DVM (an Extech EX330), the CalVref bandgap seems to let me derive Vcc more accurately, but I've been wondering whether I'm doing something wrong with the cap. I read the cap when the unit is powered from the onboard regulator (an MCP1700) rather than from USB, and the voltage on the cap seems to increase slowly over time(?), only stabilizing after about 5 minutes.
Has anyone else compared these two methods before? Which one do I trust to give me the real internal vref?