#define vs. const variable

I've read somewhere that it's better to use a const variable rather than use #define. However, I've also read that #define uses no program memory or RAM after compiling. So now I'm not sure which is better to use, or whether one is better than the other in different cases. If any Admins or God Members read this, please post a reply; I'm very interested in hearing the facts.

Thanks, DigitalJohnson

Which is better depends on what you are defining.

The #define construct is a preprocessor directive. The value is substituted for the name wherever the name occurs in the source code. So, something like:

#define ledPin 13
digitalWrite(ledPin, HIGH);

looks to the compiler just like

digitalWrite(13, HIGH);

Whether that is better than

const int ledPin=13;
digitalWrite(ledPin, HIGH);

or not would require looking at the assembly code that was produced, which I am not qualified to do.

There are advantages to using the const approach.

if(ledPin = 13)
{
}

will fail to compile, because ledPin is const. It can't be changed after being assigned a value, so the compiler will display an error message that will, hopefully, tell you that you should have used ==.

If you are doing something like this:

#define HelloMessage "Welcome to the wonderful world of Arduino. Enjoy your stay"

and using HelloMessage in 25 places, you'll get 25 copies of the string in SRAM. A const char array would result in only one copy.
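
A minimal sketch of the const alternative (helloMessage is an illustrative name):

const char helloMessage[] = "Welcome to the wonderful world of Arduino. Enjoy your stay";

Serial.println(helloMessage);   // every use refers to the same single array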

On the other hand, there are conventions (not always followed) that call for variable names to be camelCase, and #define names to be ALLCAPITALLETTERS, so that constants can be recognized as such.

However, I also read that #define uses no program memory

That's virtually impossible.
Imagine you've "#define"d a register number.
At some point, the processor has to load that number to access the register, so it has to come from somewhere, either as an immediate constant as part of an instruction, or from a program memory table.

However, a lot depends on the processor architecture and the compiler, and what the use of the constant is.
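
To make that concrete, here's a hedged sketch (LED_PORT_ADDR is an illustrative name; 0x25 is PORTB's memory-mapped data register on an ATmega328P):

#define LED_PORT_ADDR 0x25   // PORTB's data register on an ATmega328P

*(volatile uint8_t *)LED_PORT_ADDR |= (1 << 5);   // sets PB5 (pin 13); the 0x25 still
                                                  // ends up in flash, encoded into the
                                                  // instructions that use it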

and using HelloMessage in 25 places, you'll get 25 copies of the string in SRAM

The optimizer will take care of this for you. Strings are pooled, and references are rearranged so that only one copy is needed.

I understand a little better now. As far as checking the assembly code to see which is better, that's something I, like PaulS, am not qualified to do. :-/

Thanks again for the info, guys,
DigitalJohnson

DJ,

The #define pre-dates const declarations. #define just causes a string substitution, just as if you got into an editor and said:

Replace all instances of LEDPIN with 3

This replacement is done by the pre-processor, and all happens before the compiler ever looks at your code. In other words, the compiler never sees "LEDPIN". It only sees "3".

The advantage to using #define is that you can do some tricky things with strings. For example, you can assemble a string out of several #define definitions.
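
A minimal sketch of that (the macro names are illustrative). Adjacent string literals are concatenated by the compiler:

#define GREETING "Welcome to "
#define PLACE "Arduino"
#define MESSAGE GREETING PLACE "!"   // becomes "Welcome to Arduino!"

Serial.println(MESSAGE);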

You can also create code with a #define:

#define FOREVER for( ; ; )

FOREVER
  {
  if (Serial.available() > 0)
    ...
  }

You can also have a #define that takes arguments and substitutes those arguments into its output.
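
For example, a minimal sketch (SQUARE is an illustrative name):

#define SQUARE(x) ((x) * (x))

int area = SQUARE(5);   // the preprocessor substitutes ((5) * (5))

The extra parentheses guard against surprises when an expression like a + 1 is passed in.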

The disadvantage to #define is that since the compiler doesn't see your original code, it doesn't have any idea what you're trying to do. This is where const shines.

The compiler handles const declarations. It knows, and equally importantly, someone reading your program knows, the type of constant you're declaring.

My rule is to use const wherever I can, and use #define everywhere else. Whichever you use, you can eliminate "magic numbers" so that you never leave someone reading your code wondering what you are trying to do. Would you rather read this:

get_sensor_value(3.25);

or this:

const float sensor_calibration = 3.25;    // determined by experimenting

get_sensor_value(sensor_calibration);

Regards,

-Mike

The worst thing about #define:

//somewhere in a library out of your control
#define DEBUG

//somewhere in your code:
const boolean DEBUG = true; // ERROR because of the other library :[

Edit: And yes, I know you can #undef and then #define it again, but why not use consts?

I think this example illustrates why the convention came into being of using all upper-case letters for names of things that are #defined, and mixed case for everything else.

Shame on people that don't follow the convention.

For me, all caps is a mental cue for a constant, not the way the constant is declared.

And why #define anything and reserve that identifier? The #define directive should be used for what it was designed to do, namely preprocessing code.

The best thing about the preprocessor:

#ifdef DEBUG
  #define DEBUG_PRINT(x) Serial.println(x)
#else
  #define DEBUG_PRINT(x)
#endif

and the like.
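
A quick usage sketch, assuming the macro above is in scope: when DEBUG is not defined, the call expands to nothing and costs no program memory.

void setup() {
  Serial.begin(9600);
  DEBUG_PRINT("entering setup");   // vanishes entirely when DEBUG is undefined
}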

#defines are not type checked by the preprocessor.
(so the example macro might get into trouble if a wrong type is fed in)

consts are type checked by the compiler.

consts can also be scoped properly.
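
A minimal sketch of that scoping point (the function names are illustrative):

void blinkFast() {
  const int delayMs = 100;    // visible only inside blinkFast()
  delay(delayMs);
}

void blinkSlow() {
  const int delayMs = 1000;   // a separate constant; no collision
  delay(delayMs);
}

A #define of delayMs would apply from that point to the end of the file, so the two functions could not each have their own value.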

#defines are not type checked by the preprocessor.
(so the example macro might get into trouble if a wrong type is fed in)

True, but the compiler will type check the resulting "processed" code.

The problem that can arise is that the compiler reports an error on code the user did not write directly, so the error can be quite confusing. This is generally given as the reason const is better than #define.
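
A minimal sketch of that kind of confusing error (the struct and macro names are illustrative; this deliberately does not compile):

#define DEBUG_PRINT(x) Serial.println(x)

struct Reading { int raw; };

void setup() {
  Serial.begin(9600);
  Reading r = { 42 };
  DEBUG_PRINT(r);   // the compiler complains about Serial.println(r),
                    // a line that never appears in the user's source
}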

-j

A few weeks ago there was a post from someone trying to overload a function. One version took an int argument; the other took a boolean. He called the function with the argument 'true'.

It turns out true and false are defined as:
#define true 1
#define false 0

so the int version gets chosen.
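
A minimal sketch of that situation (report is an illustrative name):

void report(int value)     { Serial.print("int: ");     Serial.println(value); }
void report(boolean value) { Serial.print("boolean: "); Serial.println(value); }

report(true);   // with "#define true 1" this is report(1), so the int overload wins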

if it were:
const boolean true = 1; the compiler would have been able to choose the intended version of the function.

if it were:
const boolean true = 1; the compiler would have been able to choose the intended version of the function.

... as would be the case also if you define true and false as:

#define true (boolean)1
#define false (boolean)0
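
With those casts in place (reusing the illustrative report() overloads from the sketch above):

report(true);   // now expands to report((boolean)1), so the boolean overload is chosen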

Well, it's been more than 6 months since I last read this post, and I am learning a lot. :sweat_smile: I'm learning more about the use of preprocessor commands and about #defining macros. Apparently you can do quite a few very useful things with #define. It's not just for defining constants! :astonished:

I've learned so much from this forum as well as the playground. Another thing I have found to be an invaluable method for learning how to achieve specific tasks, and even better, tasks I had not even thought of, is to open and study all the library files. Most of them are very well commented, explaining every step of the way through the code. Even if you don't understand some of the concepts being used when you first read through them, you will remember seeing them when they come up again, and then they will make sense.

Once again, thanks to everyone on this forum for their contributions :D,
DJ