#define vs. pinMode

I am working on a softball scoreboard project. I created the code in Tinkercad and it works fine. I am currently building a working miniature prototype. Following the advice of seasoned programmers, I am breaking the project into smaller steps. My code for Max runs per inning was giving me fits for days. "Max runs" is a single 7-segment display that receives up/down input from an IR remote. I used #define to set the shift register pins. I could not get the display to work until I included pinMode declarations for the shift register pins.

My question is: why is pinMode necessary? I thought #define would be sufficient. I could not increment or decrement the display until I added pinMode, even though Serial.println showed the counters increasing and decreasing. Please straighten me out.

#define xLATCH 9 // Output Register Clock 12-595
#define xCLK 8  // Shift Register Clock 11-595
#define xDATA 10  // Input 14-595

void setup() {
  Serial.begin(9600);
  irrecv.enableIRIn();
  pinMode(xLATCH, OUTPUT);
  pinMode(xCLK, OUTPUT);
  pinMode(xDATA, OUTPUT);
}

      // Inside loop(), in the switch on the decoded IR value (results.value):
      case 0x5BFBFDE9:  // max score + 1, button R6B4 "Prev" 5BFBFDE9
        if (millis() - last > 150) {   // ignore repeats arriving too quickly
          delay(500);
          scoreXcounter = scoreXcounter + 1;
          tens = scoreXcounter / 10;
          ones = scoreXcounter - tens * 10;
          last = millis();

          digitalWrite(xLATCH, LOW);
          shiftOut(xDATA, xCLK, MSBFIRST, digitTable[tens]); // digitOne
          shiftOut(xDATA, xCLK, MSBFIRST, digitTable[ones]); // digitTwo
          digitalWrite(xLATCH, HIGH);

          Serial.println(results.value, HEX);
          Serial.println(scoreXcounter);
        }
        break;

      case 0x38C34200:  // max score - 1, button R6B1 "Mute"  38C34200
        if (millis() - last > 50) {    // ignore repeats arriving too quickly
          delay(500);
          scoreXcounter = scoreXcounter - 1;
          tens = scoreXcounter / 10;
          ones = scoreXcounter - tens * 10;
          last = millis();

          digitalWrite(xLATCH, LOW);
          shiftOut(xDATA, xCLK, MSBFIRST, digitTable[tens]); // digitOne
          shiftOut(xDATA, xCLK, MSBFIRST, digitTable[ones]); // digitTwo
          digitalWrite(xLATCH, HIGH);

          Serial.println(results.value, HEX);
          Serial.println(scoreXcounter);
        }
        break;
    }
    irrecv.resume();

My question is why is pinMode necessary?

Because that's the way the chip works. pinMode sets up the hardware so the pin acts as an input or an output. If you leave it as an input and digitalWrite HIGH or LOW to it, you are not driving the pin; you are just turning the internal pull-up resistor on and off.
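
Here is a minimal sketch of that difference (the pin number and timing are just picked for the demo):

const byte testPin = 9;   // arbitrary pin, only for this demo

void setup() {
  // Leave this line out and the pin stays an INPUT after reset.
  // digitalWrite(testPin, HIGH) would then only switch the internal
  // pull-up resistor on; the pin never actively drives the line, so a
  // shift register or LED hanging off it won't respond reliably.
  pinMode(testPin, OUTPUT);
}

void loop() {
  digitalWrite(testPin, HIGH);  // with OUTPUT set: pin is driven high
  delay(500);
  digitalWrite(testPin, LOW);   // with OUTPUT set: pin is driven low
  delay(500);
}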

All #define does is a text-based find and replace. The compiler never sees the defines. They aren't "sufficient" for anything except letting one piece of text stand for another: the preprocessor simply finds every instance of the first token and replaces it with the second before the code goes to the compiler.

A #define is a preprocessor directive. It is nothing more than a textual substitution in the code. So the directive:

#define xLATCH 9

does absolutely nothing in the program unless there is a C expression that has xLATCH in it. So, if you have the statement:

y = xLATCH;

in your program, after the preprocessor pass, that statement looks like:

y = 9;

On the other hand, pinMode() is a function call that establishes how a given pin in the program is to be used. Without pinMode(), how is the pin to be used? So, the statement:

pinMode(xLATCH, OUTPUT);

after the preprocessor pass, but before the compiler pass, looks as though the statement was written:

pinMode(9, OUTPUT);

#define allows you to create a symbol whose purpose is easier to understand in context than a bare 9. Also, suppose there are 20 other expressions in your code that use xLATCH. If, for some reason, you are forced to move xLATCH from pin 9 to pin 10, simply change the #define to the new value 10, recompile/upload the program, and you're done. None of those 20 additional, error-prone edits are needed to change the pin assignment. One final advantage is that symbolic constants are typeless: the compiler figures out the data type from context.
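
A tiny illustration of those last two points (the variable names here are made up, not from your sketch):

#define xLATCH 10   // latch moved from pin 9 to pin 10? Only this line changes;
                    // every expression that uses xLATCH picks up the new value.

void setup() {
  byte asByte = xLATCH;                    // the literal 10 is used as a byte here
  int  asInt  = xLATCH;                    // ...as an int here
  unsigned long scaled = xLATCH * 1000UL;  // ...and inside an unsigned long expression here
  pinMode(xLATCH, OUTPUT);                 // ...and, of course, as a pin number
  (void)asByte; (void)asInt; (void)scaled; // silence unused-variable warnings
}

void loop() {}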

Thanks gentlemen. I appreciate your replies.

The trouble with a macro (#define) is that it's just a "search and replace" and the compiler sees nothing of it. That's why, for example, error messages involving a macro can be very confusing. Luckily, in modern C++ we have an alternative: const. So

#define xLATCH 9 // Output Register Clock 12-595
#define xCLK 8  // Shift Register Clock 11-595
#define xDATA 10  // Input 14-595
//becomes
const byte LatchPin =  9; // Output Register Clock 12-595
const byte ClkPin =  8;  // Shift Register Clock 11-595
const byte DataPin = 10;  // Input 14-595

after also taking the liberty of giving them more self-explanatory names. :smiley: Now the compiler is in full control. It can throw proper error messages and do things like type checking. And because const tells it the value is not going to change at runtime, it will (probably) not store it in RAM. I say probably because if the compiler sees it can do some other optimization by placing it in RAM, it will, but that's rare.
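
To tie the two answers together, here is a minimal sketch in that style: const byte pin names, pinMode() still in setup() (const only informs the compiler; the pin direction is still set at runtime), and a small helper that pushes two digits out to the 74HC595s. The digitTable values below are common 7-segment patterns used only as placeholders; keep the ones from your own sketch.

const byte LatchPin = 9;   // Output Register Clock 12-595
const byte ClkPin   = 8;   // Shift Register Clock 11-595
const byte DataPin  = 10;  // Input 14-595

// Placeholder segment patterns for digits 0-9; substitute your own table.
const byte digitTable[10] = {
  0x3F, 0x06, 0x5B, 0x4F, 0x66, 0x6D, 0x7D, 0x07, 0x7F, 0x6F
};

void setup() {
  pinMode(LatchPin, OUTPUT);  // still required: the const declarations above
  pinMode(ClkPin,   OUTPUT);  // tell the compiler the values never change,
  pinMode(DataPin,  OUTPUT);  // but the pin direction is set here, at runtime
}

// Push a 0-99 value out to the two cascaded 595s.
void showScore(byte score) {
  byte tens = score / 10;
  byte ones = score % 10;
  digitalWrite(LatchPin, LOW);
  shiftOut(DataPin, ClkPin, MSBFIRST, digitTable[tens]);
  shiftOut(DataPin, ClkPin, MSBFIRST, digitTable[ones]);
  digitalWrite(LatchPin, HIGH);
}

void loop() {
  showScore(7);   // example: displays "07"
  delay(1000);
}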

Thanks septillion, great advice