When is a variadic macro argument "evaluated"?

I'd like to know when a variadic macro argument is "evaluated". My example is:

#define ERROR_LOGGING false
#define LOG_ERROR(...) { if (ERROR_LOGGING) Serial.println(__VA_ARGS__); }

Now, when I use:
LOG_ERROR("foo" + String(someInt) + "bar");

Does the conversion of "someInt" to String (and the concatenation of the strings) actually happen, or is the passed argument only evaluated when it's actually used?
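To make the question concrete: the preprocessor substitutes arguments purely textually, so (assuming the __VA_ARGS__ spelling above) that call expands to roughly:

// Call site:
LOG_ERROR("foo" + String(someInt) + "bar");

// After preprocessing:
{ if (false) Serial.println("foo" + String(someInt) + "bar"); }

So the argument expression does reach the compiler; the question is whether the optimizer throws the dead branch away.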

Another approach I came up with is:
#define ERROR_LOGGING false
#if ERROR_LOGGING
#define LOG_ERROR(...) { Serial.println(__VA_ARGS__); }
#else
#define LOG_ERROR(...) {}
#endif
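With ERROR_LOGGING false, the same call site now expands to an empty block, so the argument expression is discarded before the compiler ever sees it:

// Call site:
LOG_ERROR("foo" + String(someInt) + "bar");

// After preprocessing:
{};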

Does either of these approaches end up with zero performance hit when LOG_ERROR is called while ERROR_LOGGING is defined as false?
Or is there another way to do this?

Compile some code with logging on/off.
The size of the resulting code reported by the IDE will tell you if any code was generated.
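For example, a minimal test sketch along these lines (the values are just placeholders) can be built once with each setting, comparing the "Sketch uses N bytes" line the IDE prints:

#define ERROR_LOGGING false  // flip to true for the second build

#if ERROR_LOGGING
#define LOG_ERROR(...) { Serial.println(__VA_ARGS__); }
#else
#define LOG_ERROR(...) {}
#endif

void setup() {
  Serial.begin(9600);
  int someInt = analogRead(A0);  // runtime value, so nothing can be constant-folded
  LOG_ERROR("foo" + String(someInt) + "bar");
}

void loop() {}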

But I suspect either of these approaches will have zero overhead with logging off.
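If you do keep a macro, a common refinement (a habit, not a requirement here) is wrapping the body in do { } while (0) so the macro behaves like a single statement, e.g. inside an if/else without braces:

#if ERROR_LOGGING
#define LOG_ERROR(...) do { Serial.println(__VA_ARGS__); } while (0)
#else
#define LOG_ERROR(...) do { } while (0)
#endif

An empty do/while compiles to nothing, so the zero-overhead property is unchanged.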

Well I feel dumb now. Thank you. Don't know why I didn't think of that.

The second approach seems to be cleaner when compiled (actually exactly the same as if I don't use LOG_ERROR at all). The first approach is a little bit bigger. Guess that makes sense: the second one removes the code at the preprocessing stage, while the first one leaves the "if (false)" branch for the compiler to optimize away, and apparently it doesn't eliminate all of it.