I've been beating my head against this for a day or two, and I am stumped.
I am trying to measure car battery voltage.
I have a simple voltage divider with 47K and 20K resistors, plus a 1.5uF cap to ground to filter noise (I have a lot of noise at 1.5kHz and 3kHz). The divider's tap is connected to an A/D pin.
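For what it's worth, here's a quick sketch of the filter corner, assuming the cap sees the divider's Thevenin resistance (any battery-side source impedance is ignored):

```python
import math

# Rough corner-frequency check, assuming the 1.5uF cap sees the
# divider's Thevenin resistance (47K || 20K ~= 14K).
r_top, r_bot, c = 47e3, 20e3, 1.5e-6
r_th = r_top * r_bot / (r_top + r_bot)   # ~14.0 kohm
f_c = 1 / (2 * math.pi * r_th * c)       # ~7.6 Hz

print(f"corner ~ {f_c:.1f} Hz")          # well below 1.5kHz and 3kHz
```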
With 12.7V input, I should be getting a reading of:
1024 * (12.7 / 5) * 20 / (47 + 20) = 776.
I'm consistently getting readings of about 713.
I've checked everything I can think of. I checked Vcc; it's 5.03V.
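For reference, here is the same arithmetic worked through as a quick sketch (it assumes an ideal 10-bit ADC and exact resistor values; the 5.03V figure is the measured Vcc above):

```python
# Expected ADC counts vs. what the actual 713 reading implies.
vin = 12.7
r_top, r_bot = 47e3, 20e3

v_pin = vin * r_bot / (r_top + r_bot)   # ~3.79V at the A/D pin
print(1024 * v_pin / 5.00)              # ~776 counts (nominal 5V reference)
print(1024 * v_pin / 5.03)              # ~772 counts (measured Vcc)
print(713 * 5.03 / 1024)                # ~3.50V implied by the actual reading
```

So with the measured Vcc the expectation drops slightly to ~772, and the 713 reading implies the pin is sitting about 0.29V low.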
I have a simple voltage divider with 47K and 20K resistors.
So what tolerance are those resistors?
If each one sits at the wrong end of its tolerance band, your divider ratio could be noticeably off. I haven't worked it out because I don't know what tolerance you're using, but it feels like this is the problem.
Crap. Thanks for pointing out the obvious. I would have sworn I bought 1% parts, but I just checked the datasheet and they're 5%. Still, that seems like a lot of error to come from tolerance alone.
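Working it through for the worst case (a sketch assuming both resistors sit at opposite extremes of a 5% band, the most pessimistic stack-up):

```python
# Worst-case 5% tolerance stack-up: 47K reads high, 20K reads low.
# Input voltage and 5V reference are from the posts above.
vin, vref = 12.7, 5.0
r_top = 47e3 * 1.05                      # 49.35K
r_bot = 20e3 * 0.95                      # 19.0K

ratio = r_bot / (r_top + r_bot)          # ~0.278 vs. ~0.2985 nominal
print(1024 * vin * ratio / vref)         # ~723 counts
```

That gets the expected reading down to about 723, which is most of the way to the observed 713 but probably not all of it.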
I also have a 5V zener to ground in parallel with the cap. Could that make a difference? Its rated leakage is on the order of 50nA, so it should be orders of magnitude less than the ~0.2mA flowing through the divider.
I also have a 5V zener to ground in parallel with the cap. Could that make a difference?
Yes, it could. The knee on zeners is not very sharp, so you could be getting a lot more current through it than you think.
Also, leakage current is usually specified for the reverse-bias case; are you sure it is that small for forward bias?
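To put a number on it, here's a sketch of how much current the zener would have to sink to account for the whole discrepancy on its own (it assumes the divider values and measured Vcc above, and models the zener as loading the tap through the divider's Thevenin resistance):

```python
# Back-calculate the zener current needed to pull the reading
# from ~776 down to the observed 713.
vin, vcc = 12.7, 5.03
r_top, r_bot = 47e3, 20e3

v_nominal = vin * r_bot / (r_top + r_bot)   # ~3.79V expected at the tap
v_measured = 713 * vcc / 1024               # ~3.50V implied by the reading
r_th = r_top * r_bot / (r_top + r_bot)      # ~14.0K Thevenin resistance

print((v_nominal - v_measured) / r_th)      # ~21uA, ~400x the 50nA spec
```

Around 20uA at ~3.8V is entirely plausible for a 5V zener partway up its soft knee, so lifting one leg of the zener and re-reading the ADC would be a quick test.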