I have set up a voltage divider across a 5 V supply. With an input of 2.5 V to an analog pin, analogRead() returns about 780, give or take 3 counts due to fluctuation. I would expect a reading of around 511 for a 2.5 V input.
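For reference, here is the arithmetic behind my expectation, assuming a 10-bit ADC referenced to 5 V (the 5.0 V reference and 1023 full-scale count are my assumptions about the chip, not something I've verified):

```cpp
#include <cmath>

// Expected ADC count for a given input voltage,
// assuming a 10-bit converter (0..1023) referenced to vref.
double expectedCount(double vin, double vref = 5.0) {
    return vin / vref * 1023.0;   // 2.5 V -> about 511 counts
}

// Conversely, the voltage a given raw count would imply
// under the same assumption.
double impliedVolts(double counts, double vref = 5.0) {
    return counts * vref / 1023.0;  // 780 counts -> about 3.81 V
}
```

So if this math is right, a steady 780 would imply roughly 3.8 V at the pin, which is why the reading surprises me.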
I need to measure accurately to one decimal place if possible, but it seems that the ADC in the ATmega chip is not very precise. Is this true?
I am having difficulty coming up with code that accurately measures the voltage at an analog input. Any ideas? I am googled out.
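In case it helps to show what I'm attempting, here is a minimal sketch of the conversion I understand to be standard for a 10-bit ADC, plus simple averaging to smooth out the ±3-count jitter. The 5.0 V reference is an assumption; on the board the samples would come from analogRead(A0):

```cpp
#include <cmath>

// Convert a raw 10-bit ADC count (0..1023) to volts,
// assuming the ADC reference is 5.0 V (AVcc).
double countsToVolts(int counts) {
    const double vref = 5.0;   // assumed reference voltage
    return counts * vref / 1023.0;
}

// Average several raw readings to smooth out the +/-3 count jitter.
// In the real sketch, samples[] would be filled by analogRead(A0).
double averageCounts(const int *samples, int n) {
    long sum = 0;
    for (int i = 0; i < n; ++i) sum += samples[i];
    return static_cast<double>(sum) / n;
}
```

My plan would be to take a handful of readings in loop(), average them, and then convert; by this math, 511 counts works out to about 2.50 V. But with readings of 780 for a 2.5 V input, I'm not sure the conversion is my real problem.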