I want to measure 11 V to 14 V from an SLA (sealed lead-acid) battery using an ADS1115 4-channel ADC. To keep its input voltage within limits, I'm dividing the SLA voltage by 5 using a potentiometer, currently 500K. It isn't working very well, with errors of around 2 V, and inconsistent ones at that, and I'd appreciate any help you can offer.
I'm using a 500K pot because it sits across the supply all the time, this is a solar application, and I want a tiny current drain. I considered switching the divider in and out under program control, but suspected it might be very inaccurate; a rough sketch of what I had in mind is below. I started with a 2M pot but came down to 500K because of this problem. Is there a better way to do this?
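For concreteness, this is roughly the switched-divider arrangement I considered and rejected, assuming a logic-level N-MOSFET in the bottom leg of the divider and the Adafruit_ADS1X15 library; the pin number is made up for illustration:

```cpp
#include <Wire.h>
#include <Adafruit_ADS1X15.h>

Adafruit_ADS1115 ads;          // library default gain: 0.1875 mV per count

const int DIVIDER_EN_PIN = 7;  // illustrative pin; drives a logic-level
                               // N-MOSFET in the divider's bottom leg

// Read the battery channel with the divider connected only briefly.
int16_t readDividedBattery() {
  digitalWrite(DIVIDER_EN_PIN, HIGH);   // ground the divider's bottom leg
  delay(10);                            // let the divider node settle
  int16_t raw = ads.readADC_SingleEnded(0);
  digitalWrite(DIVIDER_EN_PIN, LOW);    // disconnect: no standing drain
  return raw;
}

void setup() {
  pinMode(DIVIDER_EN_PIN, OUTPUT);
  digitalWrite(DIVIDER_EN_PIN, LOW);    // divider disconnected by default
  Serial.begin(9600);
  ads.begin();
}

void loop() {
  Serial.println(readDividedBattery());
  delay(5000);
}
```

My concern was how accurate the reading would be just after switching the divider in, hence the settling delay, but I never built it.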
The input impedance of this ADC is about 6M. Is the 500K pot causing a problem here? I can't see how to do the analysis properly, though I've had a rough go at it below.
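For what it's worth, I tried to brute-force the numbers with this throwaway sketch, assuming the ADC input behaves like a plain 6M resistor to ground (which may be the wrong model) and the wiper sits at the 1/5 point of the track:

```cpp
// Loading calculation for the divider, treating the ADC as a 6M resistor.
const float R_TOP = 400e3;  // upper section of the 500K pot at the 1/5 tap
const float R_BOT = 100e3;  // lower section of the pot
const float R_ADC = 6e6;    // quoted ADC input impedance

void setup() {
  Serial.begin(9600);
  float rBotLoaded  = (R_BOT * R_ADC) / (R_BOT + R_ADC);  // bottom leg || ADC
  float ratioIdeal  = R_BOT / (R_TOP + R_BOT);            // 0.200, unloaded
  float ratioLoaded = rBotLoaded / (R_TOP + rBotLoaded);  // ~0.197, loaded
  float errAt13V = 13.0 * (ratioIdeal - ratioLoaded) / ratioIdeal;
  Serial.println(errAt13V, 3);  // prints ~0.171: volts low at 13 V in
}

void loop() {}
```

If that model is anywhere near right, static loading only accounts for roughly 0.2 V at 13 V in, not the 2 V I'm seeing, so either the model is wrong or something else is going on.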
I am using the ADS1115 in its default gain mode, where one count = 0.1875 mV, so I'm converting the raw integer returned as Vout = ADCcount × 5 × 0.1875 / 1000 volts; the ×5 is because the pot divides the real voltage by 5. Does that seem correct?
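Here is a minimal version of that conversion, assuming the Adafruit_ADS1X15 library and the battery divider on single-ended channel 0:

```cpp
#include <Wire.h>
#include <Adafruit_ADS1X15.h>

Adafruit_ADS1115 ads;  // library default gain is 2/3x: +/-6.144 V full scale,
                       // i.e. 0.1875 mV per count

void setup() {
  Serial.begin(9600);
  ads.begin();
}

void loop() {
  int16_t count = ads.readADC_SingleEnded(0);    // raw integer from the ADC
  float vBatt = count * 5.0 * 0.1875 / 1000.0;   // x5 undoes the divider;
                                                 // 0.1875 mV/count -> volts
  Serial.println(vBatt, 3);
  delay(1000);
}
```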
On the 'scope I'm seeing some noise on the ADC input pin, around 50 kHz at 0.2 mV, which correlates with Serial output, but I don't think this can account for the problem I'm seeing.
With a constant input voltage the ADC count sits around 17000 ±3 most of the time, but jumps up by about 60 counts roughly 20% of the time. I can't find the cause, and it isn't regular. Any ideas? Even so, I could tolerate that error, but not the 2 V I'm seeing. And yes, I can fudge the code to get the right answer (something like the snippet below), but that only works over a small range of voltages and is, in any case, a cop-out.
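By "fudge" I mean something like this, a scale-and-offset correction trimmed against a multimeter; the constants here are made up, and it only tracks near the point where I calibrated it:

```cpp
// Drop-in next to the conversion above. Constants are illustrative,
// not my real values.
const float FUDGE_SCALE  = 0.86;   // hypothetical slope correction
const float FUDGE_OFFSET = 0.12;   // hypothetical offset, in volts

float correctedVolts(int16_t count) {
  float v = count * 5.0 * 0.1875 / 1000.0;  // nominal conversion
  return v * FUDGE_SCALE + FUDGE_OFFSET;    // force agreement with the meter
}
```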
Perhaps of relevance: when I touch my 'scope probe to the ADC pin, the voltage the ADC reports drops dramatically. I assume my (old, analog) scope has a high input impedance, so I suppose this is telling me something, but I'm not sure what.