I've been having a little trouble trying to figure out how the code below turns A0 into an analog input. Please help... My main concern is the line "#define ANALOG0 0x01". Thank you!
bhengr:
#define ANALOG_PORT PORTF   // Analog
#define ANALOG_DDR  DDRF
#define ANALOG0     0x01    // Pin 0 Input
How does ANALOG_DDR &= ~ANALOG0; refer to analog pin 0?
On the Arduino MEGA, analog 0 (the first channel of the A/D converter) happens to be bit 0 of PORTF, but:
that won't be true on most other Arduino-like devices
the author of this code is using "external knowledge" (looking at the schematic) to figure this out, rather than deriving it from the code itself, e.g. from the definition of A0... (see the sketch below for a more portable alternative)
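For reference, here is a minimal sketch (assuming an Arduino MEGA and the standard Arduino AVR core) showing what ANALOG_DDR &= ~ANALOG0; actually does, plus a portable way to derive the same register and bit mask from A0 itself using digitalPinToBitMask(), digitalPinToPort() and portModeRegister():

#include <Arduino.h>

#define ANALOG_DDR DDRF
#define ANALOG0    0x01   // bit mask 0b00000001 -> bit 0 of the port

void setup() {
  // Mega-only version from the original post: clearing DDRF bit 0 makes
  // PF0 (which the Mega routes to the A0 header pin) an input.
  ANALOG_DDR &= ~ANALOG0;

  // Portable version: derive the register and bit mask from A0 itself,
  // so nothing here depends on knowing that A0 happens to live on PORTF.
  uint8_t mask = digitalPinToBitMask(A0);                          // 0x01 on the Mega
  volatile uint8_t *ddr = portModeRegister(digitalPinToPort(A0));  // &DDRF on the Mega
  *ddr &= ~mask;                                                   // same effect as the line above

  // Or simply let the core do the lookup:
  pinMode(A0, INPUT);
}

void loop() {}

The &= ~ part is just bit-clearing: ~ANALOG0 flips 0x01 into 0xFE, so ANDing it into the DDR register forces bit 0 to zero (input) while leaving the other seven bits untouched.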