Where it is written, the default is 10-bit resolution, which is 0 - 1023.
I know that the ADC scale is 1024 divisions by default, but I do not understand how 10 bits = 1024 divisions and 12 bits = 4096 divisions.
1 BIT of data can only hold 0 or 1 (two possible values)
2 bits can hold 00, 01, 10, or 11 (that's 4 possible values)
3 bits can hold 000, 001, 010, 011, 100, 101, 110, 111 (that's 8 possible values)
etc..
A byte of data has 8 bits; it can hold any one of 256 values.
A 10-bit value is a binary value that has 10 binary bits. It can hold any one of 1024 different values, from 0 to binary 1111111111 (i.e. 0 to 1023 inclusive).
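In general, an n-bit value can represent 2^n distinct codes, which is why 10 bits give 2^10 = 1024 steps (0 to 1023) and 12 bits give 2^12 = 4096 steps (0 to 4095). A quick C++ snippet (plain desktop code, just to illustrate the relationship, not AVR code):

```cpp
#include <cstdio>

int main() {
    // An n-bit value can represent 2^n distinct codes, from 0 to 2^n - 1.
    for (int bits = 1; bits <= 12; ++bits) {
        long count = 1L << bits;  // 2^bits
        std::printf("%2d bits -> %5ld values (0 to %ld)\n",
                    bits, count, count - 1);
    }
    return 0;
}
```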
I need to scale a 4 - 20 mA signal to a 0000 to 9999 display.
The reading should be very precise, as this will be a measurement of thickness in microns.
What would be the best analog bit resolution, and what AVR can I use?
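Just as a rough sketch of the scaling arithmetic, not a hardware recommendation: a common approach is to drop the 4-20 mA loop across a sense resistor (250 ohms gives 1-5 V) and read that voltage with the ADC. The code below assumes an Arduino-style AVR with the default 10-bit ADC, a 5 V reference, and a 250 ohm shunt on pin A0; all of those values are assumptions for illustration only.

```cpp
// Hypothetical example: 4-20 mA loop dropped across a 250 ohm shunt (1-5 V),
// read on A0 with the default 10-bit ADC and a 5 V reference. The shunt,
// reference, and pin are assumptions for illustration only.
const int   SENSE_PIN  = A0;
const float VREF       = 5.0;    // assumed ADC reference voltage
const float SHUNT_OHMS = 250.0;  // assumed sense resistor

void setup() {
  Serial.begin(9600);
}

void loop() {
  int raw = analogRead(SENSE_PIN);            // 0..1023 counts
  float volts = raw * VREF / 1023.0;          // counts -> volts
  float mA    = volts / SHUNT_OHMS * 1000.0;  // volts -> milliamps

  // Scale 4..20 mA linearly onto the 0..9999 display range and clamp.
  long display = (long)((mA - 4.0) * 9999.0 / 16.0 + 0.5);
  if (display < 0)    display = 0;
  if (display > 9999) display = 9999;

  Serial.println(display);
  delay(500);
}
```

Note that with this setup a 10-bit converter only gives roughly (5 V - 1 V) / 5 V x 1023, about 818 raw counts, over the whole 4-20 mA span, so the 0000-9999 display would jump in steps of about 12. To genuinely resolve 10000 display counts you would need at least 14 bits of effective resolution over that span (2^14 = 16384), which on most AVRs means an external ADC or heavy oversampling.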