Following on from this thread I've been trying to get my head around converting the result from analogRead to a voltage.
int value = analogRead(A0);
And imagine we get the value of 1000.
Now (ignoring the fact that integers don't have decimal places) is the voltage (assuming we have a 5V reference):
1000.0 / 1024 * 5.0 = 4.8828125
or: 1000.0 / 1023 * 5.0 = 4.8875855
In other words, divide by 1023 or 1024? I should point out that dividing by 1024 immediately gives a "wrong" answer for measuring 5V:
1023.0 / 1024 * 5.0 = 4.9951171
This is because the maximum reading is 1023.
Browsing the web:
To scale the numbers between 0.0 and 5.0, divide 5.0 by 1023.0 and multiply that by sensorValue
One vote for 1023.
Multiple schools of thought, including:
mrburnette: You divide by 1024 if you are a mathematician or if you truly understand ADC successive approximation.
Hedging his bets:
The number 0 represents 0V and 1023 represents Aref. The voltage level at any ADC port can be determined as follows:
float analogValue = (float)digitalValue * (3.3f / 1023.0f);
In theory, you should divide by 1024, because there are that many steps.
The ADC returns a value from 0 to 1023 not 0 to 1024. 0 to 1023 is 1024 different values because zero is a value too. ... That is why you divide by 1023 and not 1024.
Not exactly. Consider the nominal "width" of the ADC reading: it is 1024. That is strictly related to Vref (typically 3.3V). Now we must slice that segment into many smaller parts: in total there are 1024 "tiles" (numbered, if you like, from 0 to 1023).
Coding Badly:
The divisor (1023.0) in that tutorial is wrong. The correct value is 1024.0...
float voltage = sensorValue * (5.0 / 1024.0);
The maximum value returned by the analog-to-digital converter (1023) is correct.
The correct scaling math, *x/n (here *x/1024), correctly scales all the output data, but gives an average rounding error of 0.5 ADC counts (it always rounds down).
So, how come we get a "wrong" result for a value of 1023?
It seems that the ADC can return 1024 "slots" of values (from 0 to 1023) where each slot represents 1/1024 of the reference voltage.
So for a 5V reference voltage the slot width is:
5 / 1024 = 0.0048828125V
Thus:
A result of 0 could be: 0V to 0.00488V
A result of 1 could be: 0.00488V to 0.009766V
...
A result of 1023 could be: 4.99511V to 5V
So really the "wrong" result for 5V is more that the voltage is not necessarily 5V; it could be as low as 4.99511V.
In addition, the hardware effectively rounds the result down, so you could compensate by adding in the average error of 0.5:
1023.5 / 1024 * 5.0 = 4.99756V
So this is reasonably close to 5V. Similarly, a reading of 0 should probably be treated as being halfway between 0 and 1, so on average it could be:
0.5 / 1024 * 5.0 = 0.002441V
Oh yes, and the ATmega328 datasheet mentions, in section "23.7 ADC Conversion Result", that the conversion is based on 1024 (ADC = Vin * 1024 / Vref).