Hello,
I have a simple setup to measure voltage:
Math:
R1 = 220 Ω
R2 = 1000 Ω
Power supply = 3.3 V
I = 3.3 / (220 + 1000) ≈ 2.7 mA
V1 = I × 220 ≈ 0.6 V
V2 = I × 1000 ≈ 2.7 V
Using a digital multimeter I do get the expected values:
V1 = 0.6 V, V2 = 2.7 V, and the whole circuit measures 3.3 V.
For coding I am using Visual Studio Code with PlatformIO, but I tested the same sketch in the Arduino IDE too, with the same results.
#include <Arduino.h>
#include <stdarg.h>  // for va_list / va_start / va_arg / va_end

int readPin = 3;       // analog input pin (A3)
int readVal = 0;
float v2 = 0;
int delayTime = 500;
float vout = 3.3f;     // assumed ADC reference voltage

void setup() {
  // put your setup code here, to run once:
  pinMode(readPin, INPUT);
  Serial.begin(9600);
}

// Print n float arguments, separated by 'separator'
void printVars(int n, char separator, ...) {
  va_list l;
  float val;
  va_start(l, separator);
  for (int i = 0; i < n; ++i) {
    val = va_arg(l, double);  // floats are promoted to double in varargs
    Serial.print(val);
    Serial.print(separator);
  }
  Serial.println();
  va_end(l);
}

void loop() {
  // put your main code here, to run repeatedly:
  readVal = analogRead(readPin);
  v2 = ((float)readVal / 1023.0f) * vout;
  printVars(2, ' ', (float)readVal, v2);
  delay(delayTime);
}
The output:
542.00 1.75
542.00 1.75
...
661.00 2.13
661.00 2.13
...
1.75 is the reading when the analog pin is connected after R1 (between R1 and R2), where the multimeter reads 2.7 V to ground.
2.13 is the reading when the analog pin is connected before R1 (at the supply), where the multimeter reads 3.3 V.
Why am I getting 1.75 and 2.13?
As I understand it, placing the analog read between the two resistors should give a reading of 2.7 V (V2).
TLDR (solution):
I was powering the divider from a 3.3 V source and assumed I should multiply the analogRead() value by 3.3, but on a 5 V Arduino board the default analog reference is 5 V, so the conversion factor must be 5.0, not 3.3.
From the Arduino reference: DEFAULT is the default analog reference of 5 volts (on 5V Arduino boards) or 3.3 volts (on 3.3V Arduino boards).