Using analog input on NodeMCU ESP8266

Hello,

I'm not quite sure which model of the NodeMCU ESP8266 I've got, but my Arduino IDE works with the ESP-12E model, which is included in this link.
It's a black ESP chip on a black breakout board.
I'm trying to use the analog input A0. Many websites say that it uses a reference voltage of about 1V, so I built a voltage divider from 3.3V down to 1.1V. When I connect 1.1V to the analog input pin A0, I get values of about 321.
I then connected the 3.3V pin (measured 2.28V) to the A0 pin and still got values less than 1000.

I used this simple sketch for testing:

void setup() {
  Serial.begin(9600);
  delay(1000);
  Serial.println(F("SETUP")); 
}

void loop() {
  Serial.println(analogRead(A0));
  delay(1000);
}

Any ideas what I'm doing wrong?

Thanks for your help!

I've tried the same sketch as mentioned before with another black ESP8266 dev kit (CH340G wemos.cc LoLin new NodeMcu V3).
analogRead(A0) returns a value of 3 when A0 is connected to GND. Shouldn't it be 0?
The original NodeMCU dev kit returns values between 0 and 3 when A0 is connected to GND.
When I power on one of the dev kits there is always a new WLAN network, even though my sketch doesn't start any network. Is that normal?
Does the WLAN network interfere with the analog readings?

Pin mapping:

original | LoLin Node Mcu V3

A0 | A0
RSV | G
RSV -> measured 0V | VU -> measured 4.97V
SD3 | S3
SD2 | S2
SD1 | S1

When I connect the VU pin of the LoLin dev kit to A0 I get a value of 1024. Likewise, when I connect the Vin pin of the original NodeMCU dev kit to A0 I get a value of 1024. So the analog reference has to be greater than 3.3V.
How can I change it?
analogReference(INTERNAL) does not work.

These dev kits have a voltage divider on the A0 pin.
Connecting 1V to the ADC pin of the ESP-12 itself outputs a value of 1023/1024.

I've literally just posted about this on the esp8266 forums. I've been banging my head against this for hours. Do you know how I can change the reference voltage? I measured it with a multimeter and it's about 3.5V. Also, where did you read about the voltage divider?

Cheers

It's in the schematic of the NodeMCU:

If you want a reference voltage higher than 3.3V, just add a resistor in series with the analog input:

V[sub]1023[/sub] = 1V * (100kΩ + 220kΩ + R) / 100kΩ

Where R is the resistor you put in series with A0.

V[sub]max[/sub] = 1.8V * (100kΩ + 220kΩ + R) / 100kΩ

If you need a voltage lower than 3.3V, you can just short out the 220kΩ resistor, then V1023 = 1V and Vmax = 1.8V.

Pieter

Thank you very much :slight_smile:

I'll continue in this thread with my follow up question:

I'm trying to get an LM35 (temperature sensor) to work with the analog pin on the NodeMCU 1.0. I realise now that the voltage out (V_out) from the LM35 is not what arrives at the ADC pin on the ESP; rather, V_adc = V_out * 100/320 (because of the voltage divider).

So, if we have V_out = 0.2V, then V_adc = 0.0625V. We've basically scaled down V_out.

The LM35 datasheet says that the temperature changes by 1 °C for each 10 mV. So, given that I've scaled the voltage, I'd have to scale the 10 mV/°C to 10 * 100/320 mV/°C, right? And the temperature reading should be V_adc * 1000 / (10 * 100 / 320).

I'm getting temperatures about 5 degrees higher than I would expect. Here is my code for getting the temperature:

float getTemperature()
{
  int val = analogRead(TEMP);
  Serial.print("Analog: ");
  Serial.println(val);

  // Voltage at the analog pin, assuming a 3.3 V full scale
  float vAtAnalog = val / 1024.0 * 3.3;
  Serial.print("V at analog: ");
  Serial.println(vAtAnalog);

  float mv = vAtAnalog * 1000;

  // 10 mV/°C scaled by the 100/320 divider = 3.125 mV/°C
  float cel = mv / (10.0 * (10.0 / 32.0));

  return cel;
}

Now, when I measured with a potentiometer, I found that the voltage at which the ADC pin gives a reading of 1024 is about 3.5V. I'm not sure if this is related?

Can anyone see a problem, or possibly spot a mistake in my code above? Any help is appreciated.

V_ADC doesn't really matter in this case. V_sensor = 3200 mV * analogRead(A0) / 1023.

The 50mV difference you are getting could just be an ADC or sensor calibration problem. You could test some temperatures, import the data in Excel, do a linear regression, and add a calibration factor and offset to your program.

Pieter

Thank you Pieter. Makes sense. Regarding the ADC or sensor calibration, the sensor works fine (I've tested it on an Arduino, and I've measured the voltage of the LM35 between V_out and GND; everything works out about right). Given that, I imagine the ADC needs calibrating. How do I go about that?

As I type this, I thought that maybe I could plot the analogRead values against a known input voltage (say, using a potentiometer) and perform a linear regression, right? I'll give that a shot and report back in a few hours.

Could you check the real values of the resistors? In my case they were 197 kΩ and 97 kΩ, which explains all the deviations.

I would make sure that you have the correct parameters. You can follow the tutorial on my blog:

NodeMCU Arduino Board Tutorial

If it's a v1.0 board, be sure to use the following parameters in the board options:

Also, ensure that you are using the A0 input pin.

makerportal:
I would make sure that you have the correct parameters. You can follow the tutorial on my blog:

NodeMCU Arduino Board Tutorial

If it's a v1.0 board, be sure to use the following parameters in the board options:

Also, ensure that you are using the A0 input pin.

Most of the parameters in your screenshot were not in the menu in 2017.

Same problem here with the deviation. The reading from analogRead is always a bit low, so it needs a voltage slightly larger than 3.3V to reach 1024. It's okay to do a calibration for one ESP module. The problem is, if it's really the ADC that needs calibration, the offset may be different for each chip, and it's not possible to calibrate each chip in production.

Bilz:
.... I could plot the analogRead values against an output voltage (say using a potentiometer) and perform a linear regression, right?). I'll give that a shot and report back in a few hours.

did you do the regression?
What calibration factors did you come up with?

PieterP:
V_ADC doesn't really matter in this case. V_sensor = 3200 mV * analogRead(A0) / 1023.

Hi Pieter,

where does the 3200mV value come from?

If the max value we get is 1024, why divide by 1023?

I too am battling to get sensible readings from an ESP8266 (I am using a TMP36 but seeing similar issues).

I measure the voltage on the TMP36 pins and see 702mV and I get a pin value of 226.

226 / 1024 * 3300 mV gives me 728 mV.
If I use the 3200 value you used, it is of course better:
226 / 1024 * 3200 mV gives me 706 mV.

I want to understand the 3200mV value. My ESP8266 supplies 3290mV to the TMP36.

From the schematic in reply #4:
1 V * (100 kΩ + 220 kΩ) / 100 kΩ = 3.2 V

The maximum value you can get is 1023 (0x3FF), not 1024. However, you're right, the divisor is debatable. For example, the ATmega328P datasheet tells you to divide by 1024. I don't know if there's anything in the ESP8266 datasheet about it.

thanks Pieter, and yes, I found a very long thread on the 1023/1024 question!

Just some additional numbers:

HiLetgo brand version from Amazon, with an AI-THINKER device.
3.0 volts on A0 reads as 991 units.
That puts the full-scale (1023) reading at about 3.1V.

How can this be so variable? It doesn't make sense at all. If the ADC divider resistors were that far off, surely all the other parts would be too and the thing just wouldn't work. The Uno ADC is pretty reliable.

Confused. Again :confused: