[solved] Serial works in loop() but not in setup()

Hello,

My device is a Wemos D1 mini. I'm new to Arduino in general. Please bear with me.

For debugging I wanted to be able to use Serial anywhere, anytime but also have a global flag to disable debugging.

So I wrote it like this:

#define DEBUG 1 // 1 enable 0 disable

#if DEBUG == 1

  void serial_init() {
    if(!Serial) {
      Serial.begin(115200);

      while(!Serial) {
          delay(100);
      }

      delay(100); // just in case

      Serial.println("serial console initialized!");
    }
  }
  

#  define DEBUG_PRINT(val) serial_init(); Serial.print(val);
#  define DEBUG_PRINTF(val, format) serial_init(); Serial.print(val, format);
#  define DEBUG_PRINTLN(val) serial_init; Serial.println(val);

#else

#  define DEBUG_PRINT(val) {}
#  define DEBUG_PRINTF(val, format) {}
#  define DEBUG_PRINTLN(val) {}

#endif // DEBUG

So the initialization of the serial console happens in a subroutine, serial_init().

Now for some reason this works fine in loop() but if used in setup() the messages don't appear at all.

void setup() {
    DEBUG_PRINTLN("setup() start");
    delay(1000);
    DEBUG_PRINTLN("setup() end");
}

void loop() {
    DEBUG_PRINTLN("loop() start");
    delay(1000);
    DEBUG_PRINTLN("loop() end");
}

The result of this code is

serial console initialized!
loop() start
loop() end
loop() start
loop() end

So it's like the setup() part is completely ignored. And I just don't understand why.

Once I pull the subroutine's body into setup() directly, it works.

void setup() {
     if(!Serial) {
      Serial.begin(115200);

      while(!Serial) {
          delay(100);
      }

      delay(100); // just in case

      Serial.println("serial console initialized!");
    }

    DEBUG_PRINTLN("setup() is alive!");
}

So I can rewrite it to fix it but I would like to understand the problem so I don't run into mysterious errors in the future.

Why does Serial.begin() only work in setup() directly and not in a subroutine called from setup()?

OK, sorry for bothering you, I'm an idiot.

I had a typo

#  define DEBUG_PRINTLN(val) serial_init; Serial.println(val);

() missing

so Serial did not print until it was initialized somewhere else

edit:

and now I'm enabling compiler warnings

warning: statement is a reference, not call, to function 'serial_init' [-Waddress]
 #  define DEBUG_PRINTLN(val) serial_init; Serial.println(val);
                                         ^

Man, so useful. Warnings defaulted to None for me.

thanks all

Yeah, it's a bit unfortunate that "None" is the Arduino IDE's default setting for warnings.

Why are you calling serial_init() for every single debug print? That seems like it would be very annoying.

Here's my debug system:

#define DEBUG false  //set to true for debug output, false for no debug output
#define DEBUG_SERIAL if(DEBUG)Serial

When DEBUG is set to false, the compiler will optimize the Serial statements out of the code because it knows they will never run.

If you have a board with native USB and want the sketch to hang until Serial Monitor is opened so you don't miss any output, you can add this in setup():

#if DEBUG == true
  while (!Serial) {}
#endif  // DEBUG == true

That's not needed for your D1 Mini though.

Yeah, it’s a bit unfortunate that “None” is the Arduino IDE’s default setting for warnings.

It was even back to None for me the next day… apparently I didn't close the window correctly or something, so it didn't save the config changes. This is the problem with learning new things: I just don't know what problems to expect. With time comes experience, I guess.

Why are you calling serial_init() for every single debug print? That seems like it would be very annoying.

I had issues with serial output not appearing, and I never thought of checking for such syntax errors, so I figured maybe deep sleep disabled the serial console or something. So … I wanted a macro that works no questions asked, forcing Serial to init if necessary.

Maybe it’s overkill but with debug disabled it’s all gone anyway.

Your solution is fine too, very concise, and unless compiler optimizations are completely disabled it should remove the impossible if branch entirely, so it doesn't matter whether you do it at the preprocessor level or not.

Thanks for following up on this! Highly appreciated