I have constructed a simple voltmeter using a voltage divider supplying a voltage (0-5V) at pin A0. The actual input to the voltage divider can swing between 0-15V.

It works well and seems to accurately measure the voltage from my bench supply.

I want to use this application to measure the AVC (automatic volume control) voltage from a vintage tube radio. As you tune in a station the AVC voltage changes. The stronger the station, the lower the AVC voltage.

Once a station is tuned in the voltage stabilizes well, as displayed by my digital multimeter.

However, the voltage that the arduino displays in the comm window swings +/- 40%. For instance, if the actual voltage from the radio is 6V (as read by the DMM), the Arduino comm window display starts out at about 3V, increases over the course of about 5 seconds, tops out at about 8V, then swings back downward until it hits the minimum (3V), then repeats. It does this over and over.

I find that if I add as much as 5 µF from the A0 input to ground, it stabilizes, reads the right voltage, and barely swings at all. Why do I need to add the capacitor?

Thanks!

Mark

/*
 * Simple voltmeter: reads the divided voltage on A0 and
 * scales it back up to the actual input voltage.
 */

float vPow = 5.05;  // measured 5V rail (ADC reference)
float r1 = 98200;   // divider top resistor, ohms
float r2 = 54600;   // divider bottom resistor, ohms

void setup() {
  Serial.begin(9600);
  delay(200);
}

void loop() {
  int raw = analogRead(A0);            // 0-1023
  float v = (raw * vPow) / 1024.0;     // voltage at the A0 pin
  float v2 = v / (r2 / (r1 + r2));     // undo the divider ratio
  Serial.println(v2);
  delay(50);
}

The A/D is quite fast, so successive conversions track whatever is on the pin at that instant.
If there is any noise or ripple on the AVC line, you will see it reflected in your sketch's readings.
A capacitor on the A0 line is reasonable: together with the divider's resistance (roughly 35 kΩ Thevenin) it forms a low-pass filter, and it also supplies the charge the ADC's sample-and-hold stage draws. The ATmega datasheet recommends a source impedance of 10 kΩ or less for the analog inputs, and your divider presents considerably more than that.

You could also average multiple readings in software before you display the result; that smooths the displayed value without the capacitor's settling delay.
